Fox, Jeffrey L
Jeremy Rifkin has filed a lawsuit to block U.S. Department of Agriculture (USDA) experiments involving the transfer of human growth hormone genes into sheep and pigs, which he rejects on environmental, economic, and ethical grounds. His real target is the Department's animal breeding program; his ultimate aim is "to establish the principle that there should be no crossing of species barriers in animals." USDA officials have not yet responded to the lawsuit but they intend to continue the experiments, which they consider crucial to the progress of research, until told to stop.
The Advanced Photon Research Center (APRC) of the Japan Atomic Energy Research Institute is pursuing the research and development of advanced photon sources, such as compact, ultra-short-pulse, high-intensity lasers, x-ray lasers, and a superconducting linac-based free electron laser (FEL), together with their applications. These compact, high-intensity lasers can produce radiation with distinctive characteristics such as ultra-short pulse duration and high coherence. Hence, they can provide novel means of research in nuclear energy applications and in industrial and medical technologies. It is important to promote research on these high-intensity laser applications comprehensively and effectively, in collaboration with universities and industry nationwide. From this point of view, the APRC is expected to play a role as a center of excellence (COE) for this research. Through these research activities on the development of high-intensity lasers and their applications, we will develop "photon science and technology" as a leading key technology for the 21st century and contribute to the development of science and technology, including nuclear energy technology and the creation of new industries. (author)
Through training materials and guides, we aim to build the skills and knowledge needed to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and to our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...
Houghton, Catherine; Hunter, Andrew; Meskell, Pauline
To explore the use of paradigms as ontological and philosophical guides for conducting PhD research. A paradigm can help to bridge the aims of a study and the methods to achieve them. However, choosing a paradigm can be challenging for doctoral researchers: there can be ambiguity about which paradigm is suitable for a particular research question and there is a lack of guidance on how to shape the research process for a chosen paradigm. The authors discuss three paradigms used in PhD nursing research: post-positivism, interpretivism and pragmatism. They compare each paradigm in relation to its ontology, epistemology and methodology, and present three examples of PhD nursing research studies to illustrate how research can be conducted using these paradigms in the context of the research aims and methods. The commonalities and differences between the paradigms and their uses are highlighted. Creativity and flexibility are important when deciding on a paradigm. However, consistency and transparency are also needed to ensure the quality and rigour necessary for conducting nursing research. When choosing a suitable paradigm, the researcher should ensure that the ontology, epistemology and methodology of the paradigm are manifest in the methods and research strategies employed.
This book is an in-depth guide to effective scientific research. Ranging from the philosophical to the practical, it explains at the outset what science can – and can’t – achieve, and discusses its relationship to mathematics and laws. The author then pays extensive attention to the scientific method, including experimental design, verification, uncertainty and statistics. A major aim of the book is to help young scientists reflect upon the deeper aims of their work and make the best use of their talents in contributing to progress. To this end, it also includes sections on planning research, on presenting one’s findings in writing, as well as on ethics and the responsibilities of scientists.
Obeid, Jihad S; Johnson, Layne M; Stallings, Sarah; Eichmann, David
Fostering collaborations across multiple disciplines within and across institutional boundaries is becoming increasingly important with the growing emphasis on translational research. As a result, Research Networking Systems that facilitate discovery of potential collaborators have received significant attention from institutions aiming to augment their research infrastructure. We conducted a survey to assess the state of adoption of these new tools at Clinical and Translational Science Award (CTSA) funded institutions. Survey results demonstrate that most CTSA funded institutions have either already adopted or were planning to adopt one of several available research networking systems. Moreover, a good number of these institutions have exposed or plan to expose their data on research expertise using linked open data, an established approach to semantic web services. Preliminary exploration of these publicly available data shows promising utility in assessing cross-institutional collaborations. Further adoption of these technologies and analysis of the data are needed, however, before their impact on cross-institutional collaboration in research can be appreciated and measured.
Smith, Hilary A.; Haslett, Stephen J.
One approach to children's rights in research is to adopt a methodology that focuses on eliciting children's perspectives. Ensuring representative participation from all children allows a diversity of contexts to be reflected in the results, and points to ways in which improvements can be made in specific settings. In cultural contexts where…
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated ...
Alamir Costa Louro
Full Text Available The objective of this paper is to identify and discuss trends in the tools and methods used in project risk management and their relationship to other subjects, drawing on current scientific articles. The focus is not on understanding how these tools work in technical terms, but on considering possibilities for deeper academic study, including several suggestions for future research. Alongside this, the article discusses an alleged "one best way" imperative, normative approach. The following research question was addressed: what subjects and theories are related to project risk management tools and methods? The first contribution concerns the importance of the academic Chris Chapman as the most published and most referenced author in the survey. There are several contributions on various subjects, such as the prevalence of conceptual papers; papers on the construction industry; the problematization of contracts according to agency theory; and IT and ERP issues. Other contributions came from the bibliometric method, which yields a great deal of consolidated information about terms, topics, authors, references, periods and, of course, the methods and tools of project risk management.
Full Text Available Abstract Background There has been considerable interest recently in developing and evaluating interventions to increase research use by clinicians. However, most work has focused on medical practices, and nursing is not well represented in existing systematic reviews. The purpose of this article is to report findings from a systematic review of interventions aimed at increasing research use in nursing. Objective To assess the evidence on interventions aimed at increasing research use in nursing. Methods A systematic review of research use in nursing was conducted using databases (Medline, CINAHL, Healthstar, ERIC, Cochrane Central Register of Controlled Trials, and PsycINFO), grey literature, ancestry searching (Cochrane Database of Systematic Reviews), key informants, and manual searching of journals. Randomized controlled trials and controlled before-and-after studies were included if they included nurses, if the intervention was explicitly aimed at increasing research use or evidence-based practice, and if there was an explicit research use outcome. Methodological quality was assessed using pre-existing tools. Data on interventions and outcomes were extracted and categorized using a pre-established taxonomy. Results Over 8,000 titles were screened. Three randomized controlled trials and one controlled before-and-after study met the inclusion criteria. The methodological quality of included studies was generally low. Three investigators evaluated single interventions. The most common intervention was education. Investigators measured research use using a combination of surveys (three studies) and compliance with guidelines (one study). Researcher-led educational meetings were ineffective in two studies. Educational meetings led by a local opinion leader (one study) and the formation of multidisciplinary committees (one study) were both effective at increasing research use. Conclusion Little is known about how to increase research use in nursing.
Full Text Available In this article, the relevance of energy service agreements as a tool for raising energy efficiency is demonstrated. Based on an analysis of research on energy service agreements, the legislative base and procedural framework, and examples of the implementation of energy service agreements, the key challenges slowing the development of the energy services market in Russia have been identified. Possible ways of solving these problems are shown, and the necessity of a comprehensive approach to dealing with these issues is argued.
Kanne, Stephen M.; Mazurek, Micah O.; Sikora, Darryn; Bellando, Jayne; Branum-Martin, Lee; Handen, Benjamin; Katz, Terry; Freedman, Brian; Powell, Mary Paige; Warren, Zachary
The current study describes the development and psychometric properties of a new measure targeting sensitivity to change of core autism spectrum disorder (ASD) symptoms, the Autism Impact Measure (AIM). The AIM uses a 2-week recall period with items rated on two corresponding 5-point scales (frequency and impact). Psychometric properties were…
Full Text Available Fifty years of interferon research: aiming at a moving target. Immunity. 2006 Sep;25(3):343-8. PubMed ID 16979566.
Pantula, Sastry; Dickey, David
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
Glasgow Russell E
Full Text Available Abstract Background Much has been written about how the medical home model can enhance patient-centeredness, care continuity, and follow-up, but few comprehensive aids or resources exist to help practices accomplish these aims. The complexity of primary care can overwhelm those concerned with quality improvement. Methods The RE-AIM planning and evaluation model was used to develop a multimedia, multiple-health behavior tool with psychosocial assessment and feedback features to facilitate and guide patient-centered communication, care, and follow-up related to prevention and self-management of the most common adult chronic illnesses seen in primary care. Results The Connection to Health Patient Self-Management System, a web-based patient assessment and support resource, was developed using the RE-AIM factors of reach (e.g., allowing input and output via a choice of different modalities), effectiveness (e.g., using evidence-based intervention strategies), adoption (e.g., assistance in integrating the system into practice workflows and permitting customization of the website and feedback materials by practice teams), implementation (e.g., identifying and targeting actionable priority behavioral and psychosocial issues for patients and teams), and maintenance/sustainability (e.g., integration with current National Committee for Quality Assurance recommendations and clinical pathways of care). Connection to Health can work on a variety of input and output platforms, and assesses and provides feedback on multiple health behaviors and multiple chronic conditions frequently managed in adult primary care. As such, it should help to make patient-healthcare team encounters more informed and patient-centered. Formative research with clinicians indicated that the program addressed a number of practical concerns, and they appreciated its flexibility and how the Connection to Health program could be customized to their office. Conclusions This primary care practice
Peek, Niels; Combi, Carlo; Marin, Roque; Bellazzi, Riccardo
Over the past 30 years, the international conference on Artificial Intelligence in MEdicine (AIME) has been organized at different venues across Europe every 2 years, establishing a forum for scientific exchange and creating an active research community. The Artificial Intelligence in Medicine journal has published theme issues with extended versions of selected AIME papers since 1998. To review the history of AIME conferences, investigate its impact on the wider research field, and identify challenges for its future. We analyzed a total of 122 session titles to create a taxonomy of research themes and topics. We classified all 734 AIME conference papers published between 1985 and 2013 with this taxonomy. We also analyzed the citations to these conference papers and to 55 special issue papers. We identified 30 research topics across 12 themes. AIME was dominated by knowledge engineering research in its first decade, while machine learning and data mining prevailed thereafter. Together these two themes have contributed about 51% of all papers. There have been eight AIME papers that were cited at least 10 times per year since their publication. There has been a major shift from knowledge-based to data-driven methods while the interest for other research themes such as uncertainty management, image and signal processing, and natural language processing has been stable since the early 1990s. AIME papers relating to guidelines and protocols are among the most highly cited. Copyright © 2015 Elsevier B.V. All rights reserved.
Green plants are the ultimate source of all resources required for man's life, his food, his clothes, and almost all his energy requirements. Primitive prehistoric man could live from the abundance of nature surrounding him. Man today, dominating nature in terms of numbers and exploiting its limited resources, cannot exist without employing his intelligence to direct natural evolution. Plant sciences, therefore, are not a matter of curiosity but an essential requirement. From such considerations, the IAEA and FAO jointly organized a symposium to assess the value of mutation research for various kinds of plant science, which directly or indirectly might contribute to sustaining and improving crop production. The benefit that plant breeders can derive, through developing better cultivars, from using the additional genetic resources resulting from mutation induction has been assessed before at other FAO/IAEA meetings (Rome 1964, Pullman 1969, Bari 1974, Ibadan 1978) and is also monitored in the Mutation Breeding Newsletter, published by the IAEA twice a year. Several hundred plant cultivars which carry economically important characters because their genes have been altered by ionizing radiation or other mutagens are grown by farmers and horticulturists in many parts of the world. But the benefit derived from such mutant varieties is without any doubt surpassed by the contribution which mutation research has made towards the advancement of genetics. For this reason, a major part of the papers and discussions at the symposium dealt with the role induced-mutation research played in providing insight into gene action and gene interaction, the organization of genes in plant chromosomes in view of homology and homoeology, the evolutionary role of gene duplication and polyploidy, the relevance of gene blocks, the possibilities for chromosome engineering, the functioning of cytoplasmic inheritance and the genetic dynamics of populations. In discussing the evolutionary role of
Diaz Andrade, Antonio
Although the advantages of case study design are widely recognised, its original positivist underlying assumptions may mislead interpretive researchers aiming at theory building. The paper discusses the limitations of the case study design for theory building and explains how grounded theory systemic process adds to the case study design. The…
Full Text Available Research tools are the resources researchers need to use in experimental work. In biotechnology, these can include cell lines, monoclonal antibodies, reagents, animal models, growth factors, combinatorial chemistry libraries, drugs and drug targets, clones and cloning tools (such as PCR), methods, laboratory equipment and machines, databases and computer software. Research tools therefore serve as the basis for upstream research to improve an existing product or process. There are several challenges in the way of using patented research tools. IP issues with regard to research tools are important and may sometimes pose a hindrance for researchers; hence, in the case of patented research tools, IPR issues can constitute a major hurdle for technology development. In most instances, research tools are made available through MTAs for academic research and for imparting education. TRIPS provides an exception to patent rights for experimental use of patented technology in scientific research, and several countries, including India, have included this provision in their patent legislation. For commercially important work, licensing of research tools can be based on royalties or a one-time lump sum payment. Some patent owners of important high-end research tools for the development of platform technologies create problems in licensing, which can impede research. Usually the cost of a commercially available research tool is built into its price.
Cadogan, Cathal A; Ryan, Cristín; Hughes, Carmel
There is a growing emphasis on behavior change in intervention development programmes aimed at improving public health and healthcare professionals' practice. A number of frameworks and methodological tools have been established to assist researchers in developing interventions seeking to change healthcare professionals' behaviors. The key features of behavior change intervention design involve specifying the target group (i.e. healthcare professional or patient cohort), the target behavior and identifying mediators (i.e. barriers and facilitators) of behavior change. Once the target behavior is clearly specified and understood, specific behavior change techniques can then be used as the basis of the intervention to target identified mediators of behavior change. This commentary outlines the challenges for pharmacy practice-based researchers in targeting dispensing as a behavior when developing behavior change interventions aimed at pharmacists and proposes a definition of dispensing to consider in future research. Copyright © 2016 Elsevier Inc. All rights reserved.
Mahajan Ashwini; Prof. B.V. Jain; Dr Surajj Sarode
A centrifuge is a critical piece of laboratory equipment. The purpose of this study was to examine the research centrifuge in detail: its applications, its uses in different branches, and its salient features. Two types of research centrifuge are studied here: the revolutionary research centrifuge and the microprocessor research centrifuge. A centrifuge is a device that separates particles from a solution through the use of a rotor. In biology, the particles are usually cells, subcellular organelles, or large mo...
Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro
The encoding of molecular information into molecular descriptors is the first step in in silico chemoinformatics methods for drug design. Machine learning methods offer a complex solution for finding prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or the physical-chemical properties of an atom or group of atoms, to molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, predicting their activity is a complicated task and interpreting the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free Web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions, and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools, and drug SMILES formulas for two servers. The molecular graph descriptor-based machine learning models could be useful tools for in silico screening of new peptides/proteins as future drug targets for specific treatments.
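As a toy illustration of the descriptor idea (not code from Bio-AIMS), the sketch below computes one very simple molecular-graph descriptor, the mean vertex degree of an atom-connectivity graph. Real QSAR pipelines use far richer descriptor sets, but the principle of mapping structure to numbers a model can consume is the same; the bond list and atom labels here are hypothetical.

```python
# Toy molecular-graph descriptor: mean vertex degree of the
# atom-connectivity graph. Atoms are vertices, bonds are edges.
from collections import defaultdict

def mean_degree(bonds):
    """bonds: list of (atom_i, atom_j) pairs in a molecular graph."""
    degree = defaultdict(int)
    for i, j in bonds:
        degree[i] += 1
        degree[j] += 1
    # Mean degree = (2 * number of edges) / number of vertices.
    return sum(degree.values()) / len(degree)

# Ethanol heavy-atom skeleton C-C-O: three atoms, two bonds.
print(mean_degree([("C1", "C2"), ("C2", "O")]))  # ≈ 1.33
```

A QSAR model would compute many such numbers per molecule and feed the resulting descriptor vector to a machine learning method that predicts the activity of interest.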
del Álamo Guaus, Òscar
This document presents the analysis, development and implementation of tools aimed at detecting and importing the currently most relevant songs into the database of Vericast, the music matching platform of BMAT that monitors and reports music usage across radios, televisions and club recordings. For this purpose, web scraping techniques have been employed, together with an algorithm to detect the songs that produce the most matches in Vericast. Besides, it has been necessary to process ...
Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim
Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing and accessing data, collaborating and exchanging information, and publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires the support of appropriate tools integrated into these infrastructures. Different projects provide disparate solutions to manage research data. Within two projects, however - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). To fully exploit the potential of these developments for enhancing data management in the geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the geosciences developed in the two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can then be set up much faster, customized to individual needs, and tools can be added at run time.
This article explores the role of drawing as a tool for reflection. It reports on a PhD research project that aims to identify and analyse the value that co-design processes can bring to participants and their communities. The research is associated with Leapfrog, a three-year project funded by the UK Arts and Humanities Research Council (AHRC).…
Full Text Available The research site “Vrchslatina” was established in the spring of 2009 with the aim of studying production processes and the structure of net primary productivity in young forest stands. The beech and spruce stands grown at the site were selected because they originated from natural regeneration and are nearly of the same age. In 2009, we established 5 research plots in each stand with the aim of measuring basic tree characteristics. Moreover, we excavated entire trees to construct allometric relations for the specific tree compartments. In the consecutive years (2010, 2011 and 2012), we also included grass communities dominated by Calamagrostis epigejos in our studies. Besides studying production processes of all tree compartments (i.e. for trees: foliage, branches, stem, coarse and fine roots; for grasses and herbs: below- and above-ground parts), we monitored several atmospheric characteristics, followed by soil characteristics, and eventually added a measurement of soil respiration. The results indicated that the forest stands (even though they were in their initial growth stages) sequestered much more carbon than the grass communities. Moreover, we demonstrated the considerable influence of climatic conditions (especially the sum of precipitation in the particular years) on net primary productivity.
Medication administration is an important and essential nursing function with the potential for dangerous consequences if errors occur. Not only must nurses understand the use and outcomes of administering medications, they must also be able to calculate correct dosages. Medication administration and dosage calculation education occurs across the undergraduate program for student nurses. Research highlights inconsistencies in the approaches used by academics to enhance student nurses' medication calculation abilities. The aim of this integrative review was to examine the literature available on effective education strategies for undergraduate student nurses on medication dosage calculations. A literature search of five health care databases (ScienceDirect, CINAHL, PubMed, ProQuest, Medline) was conducted to identify journal articles published between 1990 and 2012. Research articles on medication calculation educational strategies were considered for inclusion in this review. The search yielded 266 papers, of which 20 met the inclusion criteria. A total of 5,206 student nurses were included in the final review. The review revealed that educational strategies fell into four types: traditional pedagogy, technology, psychomotor skills and blended learning. The results suggested student nurses showed some benefit from the different strategies; however, more improvements could be made. More rigorous research into this area is needed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Zwergal, Andreas; Brandt, Thomas; Magnusson, Mans; Kennard, Christopher
Vertigo is one of the most common complaints in medicine. Despite its high prevalence, patients with vertigo often receive either inappropriate or inadequate treatment. The most important reasons for this deplorable situation are insufficient interdisciplinary cooperation, nonexistent standards in diagnostics and therapy, the relatively rare translations of basic science findings to clinical applications, and the scarcity of prospective controlled multicenter clinical trials. To overcome these problems, the German Center for Vertigo and Balance Disorders (DSGZ) started an initiative to establish a European Network for Vertigo and Balance Research called DIZZYNET. The central aim is to create a platform for collaboration and exchange among scientists, physicians, technicians, and physiotherapists in the fields of basic and translational research, clinical management, clinical trials, rehabilitation, and epidemiology. The network will also promote public awareness and help establish educational standards in the field. The DIZZYNET has the following objectives as regards structure and content: to focus on multidisciplinary translational research in vertigo and balance disorders, to develop interdisciplinary longitudinal and transversal networks for patient care by standardizing and personalizing the management of patients, to increase methodological competence by implementing common standards of practice and quality management, to internationalize the infrastructure for prospective multicenter clinical trials, to increase recruitment capacity for clinical trials, to create a common data base for patients with vertigo and balance disorders, to offer and promote attractive educational and career paths in a network of cooperating institutions. In the long term, the DIZZYNET should serve as an internationally visible network for interdisciplinary and multiprofessional research on vertigo and balance disorders. It ideally should equally attract the afflicted patients and
Full Text Available It has been shown that the use of a qualitative approach is one of the ways to further the development of accounting. The features of the concept of "quality of accounting information" in Ukrainian legislation are analyzed in the article. The author argues for the development of a normative document in which further ways of developing accounting on the basis of a qualitative approach should be formulated. The article singles out two main groups of scientists who have raised the issue of the need to improve the quality of accounting information, and the points of view of each group are set out. The relationship between the quality of accounting information and the efficiency of management decisions is analyzed. The article shows that the generation of quality information by the accounting system creates the necessary preconditions for effective management decisions. General scientific and methodological grounds for research aimed at improving the quality of accounting information are presented.
Maurício Mello Petrucio
Full Text Available The increase in population as well as in water resource demand has been intensifying the human influence in the Peri Lagoon basin. A review of the available data concerning the ecology of Peri Lagoon was made, aiming at the development of new research to understand the functioning of this ecosystem. This information can contribute to the elaboration of a future proposal for the conservation and sustainable use of the Lagoon. A high density of cyanobacteria (Cylindrospermopsis raciborskii) was detected in the Lagoon waters, which is a risky situation for the ecosystem’s health and consequently for the population. The review highlights a lack of available information about the dynamics, functioning and structure of aquatic communities, as well as their relationships with the surrounding area and the influence of abiotic factors. Continuous time-series data are also lacking. Educational, political and social practices in environmental conservation are necessary, aiming at the management and sustainable use of the Peri Lagoon basin. These practices will guarantee water resource quality and availability for current and future generations.
Curtis A. S. G.
Full Text Available Background and origins of the research of Adam Curtis. One persisting theme has been the pursuit of different landscapes at different scales to discover the routes that explain how the body is built. His research life fell in a fortunate period during which techniques and concepts for investigating structure improved year by year. His most fortunate encounter was with Michael Abercrombie and his views on the social behaviour of cells, aims for quantitation, and statistical testing. Adam worked in various environments: Geology as an undergraduate, a Biophysics Ph.D. in a Genetics department, and then various departments in turn, from anatomy via zoology to Cell Biology. Adam started his Ph.D. work in cell adhesion, studying cell movement, trapping and reaggregation phenomena, with an early start from the physico-chemical viewpoint. He made quantitative measurements of cell adhesion by kinetic methods. Interference reflection microscopy (IRM) and related optical interference techniques were brought into the field of biology by him. In turn this led, with Chris Wilkinson, a long-term colleague, to the use of micro- and nanofabrication for biological research. Polscope and photoelastic measurements were recently introduced to biology in his laboratory. One long-term theme has been to map the adhesion of cells to substrates to discover contact areas. Early data came from IRM, then TIRF (total internal reflection fluorescence) microscopy, and then Förster resonance energy transfer (FRET). Another important theme was the time scale that needed to be measured: very short indeed in suspension. This was very difficult and has only become possible very recently, but hydrodynamic calculation shows it must be very short. The attraction of the Derjaguin-Landau-Verwey-Overbeek (DLVO) theory is that it explains many features of biological adhesion. The main test of this theory depends upon the energy of the adhesion at various different separation
Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C
This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented in a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response between open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool are to: i) read the MLC file and the PDIP from the TPS; ii) calculate the fraction of beam-on time for which each point in the IMRT beam is shielded by MLC leaves; iii) interpolate correction factors from look-up tables; iv) create a corrected PDIP image as the product of the original PDIP and the correction factors, and write the corrected image to file; v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. Its operation was validated. The software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
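The correction steps ii) to iv) can be sketched as follows. This is an illustrative Python reconstruction, not the actual C# tool: the pixel grid, look-up table values, and function names are invented for the example.

```python
# Hypothetical sketch of the PDIP correction step: scale each predicted
# portal-dose pixel by a factor interpolated from a look-up table indexed
# by the fraction of beam-on time the pixel is shielded by MLC leaves.

def interp(x, xs, ys):
    """Piecewise-linear interpolation of x over sample points (xs, ys)."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def correct_pdip(pdip, shielded_fraction, lut_fractions, lut_factors):
    """Return the corrected image: original PDIP times interpolated factors."""
    return [[p * interp(s, lut_fractions, lut_factors)
             for p, s in zip(prow, srow)]
            for prow, srow in zip(pdip, shielded_fraction)]

pdip = [[1.0, 1.0], [1.0, 1.0]]      # toy 2x2 predicted EPID image
shielded = [[0.0, 0.5], [1.0, 0.0]]  # shielded-time fraction per pixel
# Invented look-up table: fully MLC-shielded points respond 20% lower.
corrected = correct_pdip(pdip, shielded, [0.0, 1.0], [1.0, 0.8])
```

A pixel shielded half the time ends up scaled by the midpoint factor, while unshielded pixels are left unchanged.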
Schou, Lone; Høstrup, Helle; Lyngsø, Elin
Schou L., Høstrup H., Lyngsø E.E., Larsen S. & Poulsen I. (2011) Validation of a new assessment tool for qualitative research articles. Journal of Advanced Nursing 00(0), 000-000. doi: 10.1111/j.1365-2648.2011.05898.x ABSTRACT: Aim. This paper presents the development and validation of a new… assessment tool for qualitative research articles, which could assess the trustworthiness of qualitative research articles as defined by Guba and at the same time aid clinicians in their assessment. Background. There are more than 100 sets of proposals for quality criteria for qualitative research. However, we… is the Danish acronym for Appraisal of Qualitative Studies. Phase 1 was to develop the tool based on a literature review and on consultation with qualitative researchers. Phase 2 was an inter-rater reliability test in which 40 health professionals participated. Phase 3 was an inter-rater reliability test among…
…such previous work, two case studies are presented, in which drawings helped investigate the relationship between media technology users and two specific devices, namely television and mobile phones. The experiment generated useful data and opened for further consideration of the method as an appropriate HCI research tool.
Lintern, Fiona; Davies, Jamie; McGinty, Andrew; Fisher, Jeannine
The first cohort of a new MSc programme is due to complete the course in August 2014. During the three-year online course, students conduct several pieces of action research in their classrooms. There is little research specifically related to classroom practice in the pre-tertiary psychology classroom. The following describes the rationale and context of the MSc in the Teaching of Psychology and reports on three students' final-year research. The first considers the benefits of Psychology Applie…
Haara, Frode Olav; Bolstad, Oda Heidi; Jenssen, Eirik S.
The development of mathematical literacy in schools is of significant concern at the policy level, and research is an important source of information in this process. This review article focuses on areas of research interest identified in empirical projects on mathematical literacy, and how mathematical literacy in schools is approached by…
A new clinical trial aims to determine whether nivolumab, an immune checkpoint inhibitor, can improve control of cancer for patients with several types of tumors of the central nervous system (CNS). The CNS is composed of the brain and spinal cord, and the cause of most CNS tumors in adults is unknown.
Lindahl-Kiessling, K.; Ahlborg, U.; Bylin, G.; Ehrenberg, L.; Hemminki, K.; Lindell, B.; Nilsson, Robert; Bostroem, C.E.; Swarn, U.
The paper presents a new research program for the assessment of health risks caused by air pollutants. It is important to develop general methods for quantitative risk assessment and to strengthen the underlying scientific material. (KAE)
A diagnostics survey was made to provide a clear definition of advanced diagnostic needs and the limitations of current approaches in addressing those needs. Special attention was given to the adequacy with which current diagnostics are interfaced with signal processing/data acquisition devices and systems. Critical evaluations of selected alternative diagnostic techniques for future R and D activities are presented. The conceptual basis of the Aimed Magnetic Lead Gradiometric system as a current density/magnetic field diagnostic is established.
Gritzay, O.; Kalchenko, O.; Klimova, N.; Razbudey, V.; Sanzhur, A.
Calculation results are presented for an epithermal neutron source which can be created at the Kyiv Research Reactor (KRR) by placing specially selected moderators, filters, collimators, and shielding into the 10th horizontal experimental tube (the so-called thermal column). The general Monte Carlo radiation transport code MCNP4C, the Oak Ridge isotope generation code ORIGEN2, and the NJOY99 nuclear data processing system have been used for these calculations.
Rivas-Medina, A.; Gutierrez, V.; Gaspar-Escribano, J. M.; Benito, B.
Results of a seismic risk assessment study are often applied and interpreted by users unspecialised in the topic or lacking a scientific background. In this context, tools that help translate essentially scientific content for broader audiences (such as decision makers or civil defence officials), and that represent and manage results in a user-friendly fashion, are of indubitable value. One such tool is the visualization tool VISOR-RISNA, a web tool developed within the RISNA project (financed by the Emergency Agency of Navarre, Spain) for regional seismic risk assessment of Navarre and the subsequent development of emergency plans. The RISNA study included seismic hazard evaluation, geotechnical characterization of soils, incorporation of site effects into expected ground motions, vulnerability distribution assessment, and estimation of expected damage distributions for a 10% probability of exceedance in 50 years. The main goal of RISNA was the identification of higher-risk areas on which to focus future detailed, local-scale risk studies and the corresponding urban emergency plans. A geographic information system was used to combine different information layers, generate tables of results, and represent maps with partial and final results. The visualization tool VISOR-RISNA is intended to facilitate the interpretation and representation of the collection of results, with the ultimate purpose of defining actuation plans. A number of criteria for defining actuation priorities are proposed in this work. They are based on combinations of risk parameters resulting from the risk study (such as expected ground motion, damage, and exposed population), as determined by risk assessment specialists. Although the values that these parameters take are a result of the risk study, their distribution into several classes depends on the intervals defined by decision makers or civil defence officials. These criteria provide a ranking of
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely at the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are neither freely available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy through 'hands-on' activity. The tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as dunes in southern Israel and Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
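Spectral angle mapping, one of the algorithms mentioned above, compares two spectra by the angle between them when each is treated as a vector: similar spectral shapes give small angles regardless of overall brightness. The sketch below is a generic illustration of the technique; the function name and the toy spectra are invented, not taken from EPIF Spec.

```python
import math

def spectral_angle(t, r):
    """Angle (radians) between two spectra treated as vectors; small
    angles mean similar spectral shape regardless of brightness scale."""
    dot = sum(a * b for a, b in zip(t, r))
    norm_t = math.sqrt(sum(a * a for a in t))
    norm_r = math.sqrt(sum(b * b for b in r))
    cos_theta = max(-1.0, min(1.0, dot / (norm_t * norm_r)))
    return math.acos(cos_theta)

sand = [0.30, 0.35, 0.40, 0.45]          # toy 4-band reflectance spectra
bright_sand = [0.60, 0.70, 0.80, 0.90]   # same shape, twice as bright
vegetation = [0.05, 0.08, 0.45, 0.40]

angle_same = spectral_angle(sand, bright_sand)  # ~0: same material
angle_diff = spectral_angle(sand, vegetation)   # noticeably larger
```

This illumination invariance is why the method is popular for classifying field spectra against library references such as the USGS spectral library.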
This article surveys extensively fusion development under the following topics: US research directions; inertial confinement fusion; foreign fusion efforts; fusion issues; fusion applications; and arguments for fusion development. Dr. Dingee points out that, despite persuasive arguments for development, fusion has as yet attracted no substantial constituency, and that winning greater support for fusion may thus require a considerable technical breakthrough (namely, proof of scientific feasibility or achievement of energy breakeven), or a new focus on an energy source such as hybrids, which offer a nearer-term payoff than pure fusion. Dr. Dingee says the next major facility for magnetic confinement research (to be built in the late 1980s) has not yet been selected, but will probably be an engineering test facility; there are similar plans for inertial confinement. Whichever type is chosen, the first experimental power reactor is scheduled for the first few years of the 2000s, to be followed by commercial demonstration of fusion power in the 2010 to 2020 time frame. He points out, finally, that the complex technical and institutional issues are being considered in a climate in which the benefits of nuclear energy itself are being questioned, and that there is little doubt that future development is tied to overall decisions the nation will make regarding the value of nuclear energy.
Full Text Available Perceived organizational support (POS), a concept popularized in the early 1990s, can have both positive and negative effects on staff and organizations. In many ways, perceived organizational support can determine the continuity of an organization over the long term. This study looks at the relationship between human resources practices, considered along five dimensions, and perceived organizational support. An investigation was conducted in the bed and supplier industry in Kayseri. The research, performed with 227 workers, concludes that there are positive relations between training and human resources policy practices and perceived organizational support at the level of individual dimensions, and between human resource management practices and perceived organizational support in general.
Akhtar-Schuster, M.; San Juan Mesonada, C.
DesertNet (http://www.european-desertnet.eu) is an interdisciplinary scientific network which was established in October 2006 at the UN premises in Bonn, Germany, by a group of international scientists. The network strives to generate and enhance scientific knowledge and understanding of the biophysical and socio-economic processes of desertification. It provides an international platform for scientifically based discussions and exchange of ideas, addressing knowledge gaps and identifying research areas. DesertNet is also a think-tank community which identifies issues and priorities for the sustainable development of drylands. The paper outlines the current role of DesertNet in the international scientific community and delineates its role in strengthening the science/policy interface. (Author) 2 refs.
Walker, C. E.; Fersch, A.; Barringer, D.; Pompea, S. M.
In workshops on GLOBE at Night, teacher professional development has begun on using night-sky brightness data and bat telemetry data to do scientific research in the classroom. The study looks at the effects of light pollution on the flight paths of threatened and endangered (T&E) bats between their day roosts and night foraging areas. A jump-start in getting secondary school students involved was the BioBlitz event in Tucson, Arizona in October 2011. During the 24-hour event, night Sky Quality Meter (SQM) data were taken across Saguaro National Park West, through Tucson, and across Saguaro National Park East. The program began with a pair of Research Experiences for Undergraduates (REU) students and their advisor. Through the collaboration of the National Science Foundation's REU program, the National Optical Astronomy Observatory's GLOBE at Night program and the Arizona Game and Fish Department (AzGFD), the two REU students and their advisor used data from the GLOBE at Night project and telemetry tracking data of lesser long-nosed bats to study the effects of light pollution on the bats' flight paths between their day roosts and night foraging areas around the city of Tucson, AZ. During the summer of 2010, the first REU student used the visual limiting magnitude data from GLOBE at Night and, with the assistance of the AzGFD, ran compositional analyses with respect to the bats' flight paths to determine whether the bats were selecting for or against flight through regions of particular night-sky brightness levels. The bats selected for regions in which the limiting sky magnitudes fell between the ranges of 2.8-3.0 to 3.6-3.8 and 4.4-4.6 to 5.0-5.2, suggesting that the lesser long-nosed bat can tolerate a fair degree of urbanization. Three areas of systematic uncertainty were identified, of which two could be addressed the following summer. Due to a relatively large uncertainty in each individually measured visual limiting magnitude
Balogun, G.I.; Jonah, S.A.; Umar, I.M.
Safety culture has been defined as 'that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. This paper briefly highlights efforts being made at the Centre for Energy Research and Training (CERT) towards realizing this broad objective as far as possible. To this end, CERT recognizes the need for instituted safety measures to reflect the significant, site-specific characteristics of any generic reactor type. Consequently, standard procedures for pre-startup, startup and shutdown of NIRR-1 (a miniature neutron source reactor, MNSR) have been reviewed to reflect our local conditions and peculiarities. The review has revealed the need to incorporate important steps that impact the overall safety of the facility. For instance, an interlocking system is being considered between NIRR-1 startup on the one hand and mandatory pre-startup measures on the other. Also, a procedure has been put in place that would facilitate a rapid response in the event of a rod-stuck-at-full-withdrawal incident. Furthermore, a program of automation of important analysis and design calculations for MNSRs is ongoing. Emphasis is also placed, and deliberate efforts are made, on ensuring that a working atmosphere prevails that fosters the correct attitudinal approach to matters of reactor safety. A regime of constant dialogue and discussion amongst operating personnel has been factored into the overall operational program. (author)
Bales, R.; Dozier, J.; Famiglietti, J.; Fogg, G.; Hopmans, J.; Kirchner, J.; Meixner, T.; Molotch, N.; Redmond, K.; Rice, R.; Sickman, J.; Warwick, J.
…different systems? (iv) How can the predictive ability for these responses be improved? The water resources question is then "how can new information inform decision-making aimed at achieving water resources sustainability?" The planning group is soliciting participation from the wider community with a stake in mountain hydrology and related fields, in order to develop a focused yet broadly useful infrastructure that will accelerate scientific progress for years and decades to come.
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.
Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.
Reach, Gérard; Choleau, Carine
Carbohydrate counting is the most difficult component of functional insulin therapy. We thus designed a tool to facilitate carbohydrate counting of meals. The tool consists of an illustrated directory (16 x 10 cm, 119 pages) presenting 389 pictures of food, classified in 12 categories (breads, fruits, vegetables, etc.). For each photo, the name and mode of preparation of the foods are given, with the weight of the illustrated portion and its carbohydrate content as a multiple of 5 g. During the first phase of the study (3 days a week for 12 weeks), twelve patients with type 1 diabetes completed a precise food diary using a list and weight of all consumed foods. We were thus able to determine, for each of three meals (breakfast, lunch and dinner), the variability of their carbohydrate content. During the second phase of the study (2 weeks), the patients were given the possibility of using the illustrated food directory. We asked them first to estimate, from the photos, the global carbohydrate content of their meals, and then to weigh each food. This allowed us to calculate the true carbohydrate content of the meals from nutritional tables. During the first phase, the carbohydrate contents of breakfast, lunch and dinner were 67 ± 29, 72 ± 30 and 74 ± 30 g, respectively (mean ± SD, n = 12). For a given patient, the variability in the carbohydrate content of each meal was expressed by its standard deviation (SD). For the 12 patients, this variability was 18 ± 8, 25 ± 8 and 27 ± 11 g, respectively, for breakfast, lunch and dinner, and represented on average about one-quarter of the total carbohydrate content. During the second phase, carbohydrate content, estimated by the patients using the illustrated food directory, correlated well with the retrospective evaluation based on nutritional tables (y = 0.95x + 5 g, r² = 0.8; n = 12, 235 meals). This new illustrated food repertory allows accurate evaluation of the highly variable carbohydrate content of
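The validation step, regressing the patients' picture-based estimates against weighed reference values, can be sketched with an ordinary least-squares fit. The meal data below are invented for illustration; only the reported fit, y = 0.95x + 5 g, is taken from the study.

```python
# Sketch of the validation regression: least-squares fit of picture-based
# carbohydrate estimates (y) against weighed reference values (x).
# The five meals are hypothetical, placed exactly on the reported line.

true_carbs = [30.0, 45.0, 60.0, 75.0, 90.0]        # weighed carbs (g)
estimates = [0.95 * x + 5.0 for x in true_carbs]   # idealised estimates

n = len(true_carbs)
mean_x = sum(true_carbs) / n
mean_y = sum(estimates) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(true_carbs, estimates))
sxx = sum((x - mean_x) ** 2 for x in true_carbs)
slope = sxy / sxx                    # recovers 0.95 on this toy data
intercept = mean_y - slope * mean_x  # recovers 5.0
```

On real data the points scatter around the line, and the spread is what the reported r² = 0.8 summarizes.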
Kevin R. Butt
Full Text Available Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
Butt, K.R.; Grigoropoulou, N.
Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low-cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
Full Text Available The new Internet technologies have infiltrated the academic environment in a stunning way, both at the individual and at the institutional level. More and more teachers have started educational blogs, librarians are active on Twitter, other educational actors curate web content, students post on Instagram or Flickr, and university departments have Facebook pages and/or YouTube accounts, etc. Today, the use of web technology has become "a legitimate activity in many areas of higher education" (Waycott, 2010) and a considerable shift to digital academic research has gradually occurred. Teachers are encouraging students to take up digital tools for research and writing, thus revealing new ways of using information and communication technologies for academic purposes and not just for socializing. The main objective of this paper is to investigate the effects of integrating diverse digital Web 2.0 tools, resources, and OERs/MOOCs in research and in the construction of students' academic texts. We aim to stress the increasing influence of digital and online tools in academic research and writing. Teachers, specialists, and students alike are affected by this process. In order to show how, we explore the following issues: What is Research 2.0? Which digital/online tools have we used to assist our students? What are the challenges for academic research using digital/Web 2.0 tools? And how do digital tools shape academic research?
Michael A. Langston
Full Text Available Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.
Katherine D. Seelman
Full Text Available The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review the literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into the main topics of telerehabilitation, policy, population of users, and policy-specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification of Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are essential to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality-of-life items such as assistive technology and environmental factors. Keywords: Telerehabilitation, Telehomecare, Telepsychiatry, Telepractice
Gläser, Jochen; Laudel, Grit
Qualitative research aimed at "mechanismic" explanations poses specific challenges to qualitative data analysis because it must integrate existing theory with patterns identified in the data. We explore the utilization of two methods—coding and qualitative content analysis—for the first steps in the
Lemmens, L.H.J.M.; Müller, V.N.L.S.; Arntz, A.; Huibers, M.J.H.
We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize study characteristics and results of 35 relevant studies, and discuss the extent
Full Text Available Lean production has been applied by organizations around the world as part of a strategy to reduce waste in their value streams with the intent of improving performance. Although lean production concepts and tools have continually been introduced in different companies, the literature shows that most applications occur in large companies rather than in small and medium-sized enterprises (SMEs). The reason may be the lack of financial and human resources available for lean initiatives in these enterprises. However, a wide range of lean production concepts and techniques can be applied without considerable financial investment. This paper aims to study the opportunities and difficulties of applying lean production in SMEs. Case research was conducted in three SMEs of the industrial cluster of Jaú/SP, and the results suggest that these companies face similar difficulties. Nevertheless, the study confirms that there are several opportunities to implement lean production concepts and techniques which can lead to improved performance in these companies.
Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team
The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web-based Data Discovery Tool, providing a drill-down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing spectral energy distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of varying depth, wavelength and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.
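The co-spatial cross-referencing described above reduces, at its core, to positional catalog matching: for each source in one catalog, find the nearest source in another within an angular tolerance. The sketch below is a generic, brute-force illustration; the coordinates and 1-arcsecond radius are invented, and production tools use indexed algorithms rather than this all-pairs scan.

```python
import math

def angular_sep(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees, via the haversine formula."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cross_match(cat_a, cat_b, radius_deg):
    """For each (ra, dec) in cat_a, return the index of the nearest
    cat_b source within the match radius, or None if there is none."""
    matches = []
    for ra, dec in cat_a:
        seps = [angular_sep(ra, dec, rb, db) for rb, db in cat_b]
        best = min(range(len(seps)), key=seps.__getitem__)
        matches.append(best if seps[best] <= radius_deg else None)
    return matches

candels = [(150.10, 2.20), (150.30, 2.40)]       # toy (RA, Dec) in degrees
other = [(150.1001, 2.2001), (151.00, 3.00)]     # toy comparison catalog
result = cross_match(candels, other, radius_deg=1 / 3600)  # 1 arcsec
```

The first toy source matches its near-coincident counterpart; the second has no neighbor within tolerance and is left unmatched.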
This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment. (author)
Lemmens, Lotte H J M; Müller, Viola N L S; Arntz, Arnoud; Huibers, Marcus J H
We present a systematic empirical update and critical evaluation of the current status of research aimed at identifying a variety of psychological mediators in various forms of psychotherapy for depression. We summarize the study characteristics and results of 35 relevant studies, and discuss the extent to which these studies meet several important requirements for mechanism research. Our review indicates that in spite of increased attention to the topic, advances in theoretical consensus about the necessities for mechanism research, and the sophistication of study designs, research in this field is still heterogeneous and methodologically unsatisfactory. Probably the biggest challenge in the field is demonstrating the causal relation between change in the mediator and change in depressive symptoms. The field would benefit from a further refinement of research methods to identify processes of therapeutic change. Recommendations for future research are discussed. However, even with the most optimal research designs, explaining psychotherapeutic change remains a challenge. Psychotherapy is a multi-dimensional phenomenon that might work through an interplay of multiple mechanisms at several levels. As a result, it might be too complex to be explained by relatively simple causal models of psychological change. Copyright © 2016 Elsevier Ltd. All rights reserved.
Robinson, D.; Maggi, B.
The Education and Public Outreach (EPO) component of the satellite-based research mission "Aeronomy of Ice In the Mesosphere" (AIM) will bridge the unique scientific aspects of the mission to informal education organizations. The informal education materials developed by the EPO will utilize AIM data and educate the public about the environmental implications associated with the data. This will assist with creating a scientifically literate workforce and in developing a citizenry capable of making educated decisions related to environmental policies and laws. The objective of the AIM mission is to understand the mechanisms that cause Polar Mesospheric Clouds (PMCs) to form, how their presence affects the atmosphere, and how change in the atmosphere affects them. PMCs are sometimes known as Noctilucent Clouds (NLCs) because of their visibility during the night from appropriate locations. The phenomenon of PMCs is an observable indicator of global change, a concern to all citizens. Recent sightings of these clouds over populated regions have compelled AIM educators to expand informal education opportunities to communities worldwide. Collaborations with informal organizations include: Museums/Science Centers; NASA Sun-Earth Connection Forum; Alaska Native Ways of Knowing Project; Amateur Noctilucent Cloud Observers Organization; National Parks Education Programs; After School Science Clubs; Public Broadcasting Associations; and National Public Radio. The Native Ways of Knowing Project is an excellent example of informal collaboration with the AIM EPO. This Alaska based project will assist native peoples of the state with photographing NLCs for the EPO website. It will also aid the EPO with developing materials for informal organizations that incorporate traditional native knowledge and science, related to the sky. Another AIM collaboration that will offer citizens lasting informal education opportunities is the one established with the United States National Parks
Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria
Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...
Andrea Calsamiglia Madurga
We present a theoretical and epistemological reflection on Forum Theater’s potential as a research tool. Our involvement in social action and research has led us to a double reflection: on the limitations of qualitative research in the study of affects, and on Forum Theater’s potential as a research tool to tackle research about affects. After some specific experiences in action research (qualitative research on romantic love and gender violence, and the creation process of the Forum Theater “Is it a joke?”), we explore Forum Theatre’s possibilities as a research tool within the feminist epistemology framework.
Nijssen, E.J.; Frambach, R.T.
This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client
Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to asses...
The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...
O V Verkhodanov
We describe the current status of CATS (astrophysical CATalogs Support system), a publicly accessible tool maintained at the Special Astrophysical Observatory of the Russian Academy of Sciences (SAO RAS; http://cats.sao.ru) that allows one to search hundreds of catalogs of astronomical objects discovered all along the electromagnetic spectrum. Our emphasis is mainly on catalogs of radio continuum sources observed from 10 MHz to 245 GHz, and secondly on catalogs of objects such as radio and active stars, X-ray binaries, planetary nebulae, HII regions, supernova remnants, pulsars, nearby and radio galaxies, AGN and quasars. CATS also includes the catalogs from the largest extragalactic surveys at non-radio wavelengths. In 2008 CATS comprised a total of about 10⁹ records from over 400 catalogs in the radio, IR, optical and X-ray windows, including most source catalogs deriving from observations with the Russian radio telescope RATAN-600. CATS offers several search tools through different means of access, e.g. via a Web interface and e-mail. Since its creation in 1997 CATS has managed about 10⁵ requests. Currently CATS is used by external users about 1500 times per day, and since its opening to the public in 1997 it has received about 4000 requests for its selection and matching tasks.
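A positional cross-match of the kind CATS performs can be sketched as follows. The catalog entries, source names, and function names below are invented for illustration; this is not CATS's actual query interface:

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two sky positions, in degrees."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    # Haversine form; numerically well-behaved for small separations.
    d = math.sin((dec2 - dec1) / 2) ** 2 + \
        math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2
    return math.degrees(2 * math.asin(math.sqrt(d)))

def cross_match(cat_a, cat_b, radius_deg):
    """Pair each source in cat_a with all cat_b sources within radius_deg.

    cat_a, cat_b: lists of (name, ra_deg, dec_deg) tuples.
    Returns a list of (name_a, name_b, sep_deg) matches.
    """
    matches = []
    for name_a, ra_a, dec_a in cat_a:
        for name_b, ra_b, dec_b in cat_b:
            sep = angular_sep_deg(ra_a, dec_a, ra_b, dec_b)
            if sep <= radius_deg:
                matches.append((name_a, name_b, sep))
    return matches

# Hypothetical radio and X-ray source lists
radio = [("J1230+1223", 187.70, 12.39)]
xray = [("X1", 187.71, 12.39), ("X2", 190.00, 11.00)]
print(cross_match(radio, xray, radius_deg=0.05))
```

A real service would replace the quadratic loop with a spatial index, but the matching criterion is the same.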
Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter
In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 Speech-language Pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.
Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.
This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....
Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…
The paper presents particular results of the first phase of a research project aimed at improving pre-graduate teacher training in the area of didactic technological competences. The main goal of the prepared research is to modernize and optimize the relevant parts of the study programs of teacher trainees at Slovak higher education institutions (the inclusion and structure of the relevant subjects in the study programs, their content and time assignment). The results are related to a questionnaire survey of the current state and perspectives of the continuing professional development of primary and secondary school teachers contributing to the improvement and development of their didactic technological competences. Main attention is paid to an analysis of the selected questionnaire items in which the respondents assessed the significance of the use of various interactive educational activities and digital means in the teaching process to increase the efficiency of selected specific aspects of education. The presented analysis is based on the segmentation of the respondents by the category and sub-category of teaching staff to which they belong.
Lobashev, V.M.; Tavkhelidze, A.N.
A meson facility is being built at the Institute of Nuclear Research, USSR Academy of Sciences, in Troitsk, where the Scientific Center, USSR Academy of Sciences is located. The facility will include a linear accelerator for protons and negative hydrogen ions with 600 MeV energy and 0.5-1 mA beam current. Some fundamental problems that can be studied at a meson facility are described in the areas of elementary particles, neutron physics, solid state physics, and applied research. The characteristics of the linear accelerator are given and the meson facility's experimental complex is described
Ebrahim, Nader Ale
“Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...
The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…
View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy
This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…
Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.
The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it feasible to use smartphones as a data collection tool for social research? What are the effects on data quality? Can we also incorporate reality mining tools in the smartphone app to replace traditional...
Price, Geoffrey P.; Wright, Vivian H.
Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…
Nishina, Kojiro; Nishihara, Hideaki; Mishima, Kaichiro
This meeting was held on March 4, 1993. Thirty years have elapsed since the first power generation with the JPDR and the initial criticality of the KUR, and twenty years since the initial criticality of the KUCA. Researchers in universities have contributed greatly to research and education in atomic energy, but the prospect of leading the world in this field hereafter is very uncertain. This study meeting was held to seek ways to make the proper contribution. In the meeting, lectures were given on Japanese policy on the nuclear fuel cycle, the present state of upstream research and downstream research in Japan, the experimental plan in NUCEF, the present state of research on TRU decay heat data and TRU nuclear data, the present state of experimental research at the KUCA and the FCA, the present state of research on heat removal from high conversion LWRs and the KUR, the present state of research on radioactive waste treatment, and the present state of TRU chemical research. The record of the holding of this study meeting is added. (K.I.)
Michael D. Coovert
Serious games are an attractive tool for education and training, but their utility is even broader. We argue that serious games provide a unique opportunity for research as well, particularly in areas where multiple players (groups or teams) are involved. In our paper we provide background in several substantive areas. First, we outline major constructs and challenges found in team research. Secondly, we discuss serious games, providing an overview and description of their role in education, training, and research. Thirdly, we describe necessary characteristics for game engines utilized in team research, followed by a discussion of the value added by utilizing serious games. Our goal in this paper is to argue that serious games are an effective tool with demonstrated reliability and validity and should be part of a research program for those engaged in team research. Both team researchers and those involved in serious game development can benefit from a mutual partnership which is research focused.
Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao
High-speed machining technology can improve processing efficiency and precision while reducing processing cost, so the technology is widely valued in industry. With its extensive application, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of chucks are used in high-speed precision machining, including the heat-shrinkage tool-holder, the high-precision spring chuck, the hydraulic tool-holder, and the three-rib deformation chuck. Among them, the heat-shrinkage tool-holder has the advantages of high precision, high clamping force, high bending rigidity, good dynamic balance, etc., and is widely used. It is therefore of great significance to study the new requirements on the machining tool system. To meet the requirements of high-speed precision machining, this paper describes the common tool-holder technologies of high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics of, and existing problems in, tool clamping systems are analyzed.
Conversational analysis, situated between pragmatic linguistics and qualitative empirical research, is a complex method which needs a lot of time and dedication. It is necessary to develop a so-called “analytical mentality”. The aim of the project presented in this paper was to develop the theoretical insights and the practical skills of a group of students for this kind of research. They worked together throughout the duration of the project, especially in the collection of empirical material: i.e. the recording of conversations between foreign and German students, the transcription of the material, a group discussion on the data and finally its analysis. This article aims to show what students can learn by doing this kind of work, based on examples of the collected empirical material: (1) they will be introduced to the different levels and stages of the research process and have the chance to develop a methodical and methodological competence; (2) their general communicative competences and their special competences in the foreign language will increase; and (3) their knowledge of intercultural learning will grow by working with authentic data of intercultural communication. So, for instance, stereotypes and how they have been constructed during the interaction may be analysed and precisely described on a micro-analytical level. URN: urn:nbn:de:0114-fqs0901335
Barrett, Neil E.; Liu, Gi-Zen
English has become the de facto language for communication in academia in many parts of the world, but English language learners often lack the language resources to make effective oral academic presentations. However, English for academic purposes (EAP) research is beginning to provide valuable insights into this emerging field. This literature…
Jane Maria Pancinha Costa
Based on awareness of material by Gramsci (1978, 1982) on hegemony, Freire (1979a, 1979b) on cooperative contact, and Steiner (1975) on radical psychiatry, action research methodology was used by the researcher, who was also a psychotherapist, with 12 women attending two ongoing weekly psychotherapy groups in Brazil in order to raise their social consciousness of culturally-based oppression of women, particularly relating to work; to apply life script analysis as a therapeutic intervention within the groups; and to facilitate recognition by the women of the benefits of cooperative contact when seeking to liberate themselves from oppression. Individual structured interviews were conducted and the data from these was discussed within the groups, leading to the development of a model containing 6 levels of consciousness of oppression. Examples of oppression identified by the women are provided, with only 17% relating directly to sexual discrimination at work. Although the research was conducted many years ago (1987-1989), it is shown that problems still exist and the research methodology could usefully be applied elsewhere.
Mora, J.C.; Robles, Beatriz [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Bradshaw, Clare; Stark, Karolina [Stockholm University (Sweden); Sweeck, Liev; Vives i Batlle, Jordi [Belgian Nuclear Research Centre SCK-CEN (Belgium); Beresford, Nick [Centre for Ecology and Hydrology - CEH (United Kingdom); Thoerring, Havard; Dowdall, Mark [Norwegian Radiation Protection Authority - NRPA (Norway); Outola, Iisa; Turtiainen, Tuukka; Vetikko, Virve [STUK - Radiation and Nuclear Safety Authority (Finland); Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Beaugelin-Seiller, Karine; Fevrier, Laureline; Hurtevent, Pierre; Boyer, Patrick [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)
Interaction Matrices as a Tool for Prioritizing Radioecology Research. J.C. Mora, CIEMAT. In 2010 the Strategy for Allied Radioecology (STAR) was launched with several objectives aimed towards integrating the radioecology research efforts of nine institutions in Europe. One of these objectives was the creation of European Radioecology Observatories. The Chernobyl Exclusion Zone (CEZ) and the Upper Silesian Coal Basin (USCB), a coal mining area in Poland, were chosen after a selection process. A second objective was to develop a system for improving and validating the capabilities of predicting the behaviour of the main radionuclides existing at these observatories. Interaction Matrices (IM) have been used since the 1990s as a tool for developing ecological conceptual models and have also been used within radioecology. The Interaction Matrix system relies on expert judgement for structuring knowledge of a given ecosystem at the conceptual level and was selected for use in the STAR project. A group of experts, selected from each institution of STAR, designed two matrices with the main compartments of each ecosystem (a forest in the CEZ and a lake in the USCB). All the features, events and processes (FEPs) which could affect the behaviour of the considered radionuclides, focusing on radiocaesium in the Chernobyl forest and radium in the Rontok-Wielki lake, were also included in each IM. Two new sets of experts were appointed to review, improve and prioritize the processes included in each IM. A first processing of the various candidate interaction matrices produced a single interaction matrix for each ecosystem which incorporated all the experts' combined knowledge. The prioritization of processes in the IMs, directed towards developing a whole predictive model of radionuclide behaviour in those ecosystems, raised interesting issues related to the processes and parameters involved and to the existing knowledge of them. This exercise revealed several processes
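A conceptual interaction matrix of this kind can be sketched as a simple data structure: diagonal cells hold compartments, off-diagonal cells hold the processes transferring a radionuclide between them, and expert scores drive the prioritization. The compartments, processes, and scores below are invented for illustration and are not taken from the STAR project's actual matrices:

```python
# Hypothetical forest-ecosystem compartments (the matrix diagonal).
compartments = ["soil", "understory", "trees", "fauna"]

# Off-diagonal cell (src, dst) lists processes moving radiocaesium
# from compartment src to compartment dst, each with an expert
# priority score (higher = more important to model).
matrix = {c: {} for c in compartments}
matrix["soil"]["trees"] = [("root uptake", 3)]
matrix["trees"]["soil"] = [("litterfall", 2)]
matrix["soil"]["understory"] = [("root uptake", 2)]
matrix["fauna"]["soil"] = [("excretion", 1)]

def prioritized(matrix):
    """Flatten the matrix to (source, target, process, score), highest score first."""
    rows = [(src, dst, name, score)
            for src, targets in matrix.items()
            for dst, procs in targets.items()
            for name, score in procs]
    return sorted(rows, key=lambda r: r[3], reverse=True)

for src, dst, name, score in prioritized(matrix):
    print(f"{src} -> {dst}: {name} (priority {score})")
```

The flattened, score-sorted list is what a review panel would work through when deciding which processes a predictive model must represent first.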
The "Jackson Nursery", existing from February 1937 until March 1938, was directed by Anna Freud and financed by Edith Jackson and Dorothy Burlingham. It took care of infants from the poorest strata of Vienna and also gave material support to their families. On the other hand, it was a training institution for psychoanalysts, offering the opportunity of observing children during their first two years, e.g. their feeding habits and social sense. In addition, the Jackson Nursery was a place for research where psychoanalytic theories of infantile development were checked against the findings of direct observation. The work started here was then continued by A. Freud and D. Burlingham on a larger scale in their War Nurseries. This paper examines the many-sided activities in the nursery mainly on the basis of unpublished archival documents.
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...
Sun, Y. H.; Sainio, W. C.
Test results of the Aerothermodynamic Integration Model are presented. A program was initiated to develop a hydrogen-fueled research-oriented scramjet for operation between Mach 3 and 8. The primary objectives were to investigate the internal aerothermodynamic characteristics of the engine, to provide realistic design parameters for future hypersonic engine development as well as to evaluate the ground test facility and testing techniques. The engine was tested at the NASA hypersonic tunnel facility with synthetic air at Mach 5, 6, and 7. The hydrogen fuel was heated up to 1500 R prior to injection to simulate a regeneratively cooled system. The engine and component performance at Mach 6 is reported. Inlet performance compared very well both with theory and with subscale model tests. Combustor efficiencies up to 95 percent were attained at an equivalence ratio of unity. Nozzle performance was lower than expected. The overall engine performance was computed using two different methods. The performance was also compared with test data from other sources.
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
Harbottle, Jennifer; Strangward, Patrick; Alnuamaani, Catherine; Lawes, Surita; Patel, Sanjai; Prokop, Andreas
The "droso4schools" project aims to introduce the fruit fly "Drosophila" as a powerful modern teaching tool to convey curriculum-relevant specifications in biology lessons. Flies are easy and cheap to breed and have been at the forefront of biology research for a century, providing unique conceptual understanding of biology and…
Schell, Scott R
Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.
AIM Data Services as a virtual facility provides virtual 3D reference tracks for simulation applications in the domain of automotive and railway systems. It offers tools for management and analysis of experiment data and a platform for survey and processing of vehicle data in the public transport domain. Collected spatial data is bundled in a database cluster and published through common web mapping interfaces.
Role plays are extremely valuable tools to address different aspects of teaching social responsibility, because they allow students to "live through" complex ethical decision making dilemmas. While role plays are getting high marks from students because their entertainment value is high, their educational value depends on their closeness to students' work experience and the skills of the teacher in helping students comprehend the lessons they are meant to convey.
Rodriguez, W. J.; Chaudhury, S. R.
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate scientific visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions where the researcher can interactively change the scales in the three dimensions, the color tables and the degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable
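The mining-and-filtering step described above can be illustrated with a minimal sketch: screen out flagged retrievals, then reduce the remaining values to a mean profile. The fill value, the toy layer data, and the `mean_profile` helper are invented stand-ins, not the actual SAGE II file format:

```python
# Illustrative only: a tiny stand-in for a satellite data slice.
# FILL marks flagged/invalid retrievals that must be screened out.
FILL = -999.0

# profiles[altitude_bin] = extinction values for several events
# (arbitrary units; rows could represent e.g. 10, 15, 20 km bins).
profiles = [
    [1.2, 1.1, FILL, 1.3],
    [0.8, FILL, 0.9, 0.7],
    [0.4, 0.5, 0.5, FILL],
]

def mean_profile(profiles, fill=FILL):
    """Average each altitude bin over events, skipping fill values."""
    out = []
    for layer in profiles:
        good = [v for v in layer if v != fill]
        out.append(sum(good) / len(good) if good else None)
    return out

print(mean_profile(profiles))
```

Real instrument files add quality flags, units, and coordinate metadata, but the screen-then-reduce pattern is the same one students must apply before any visualization.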
Kolbæk, Raymond; Steensgaard, Randi; Angel, Sanne
Major challenges occur when trying to implement research in clinical practice. In the West Danish Center for Spinal Cord Injury, we are conducting a practice-based Ph.D. project that involves the practice field's own members as co-researchers. In the management of the project we use an action research design to identify actual problems and to develop, test, evaluate and implement specific actions that promote patient participation in rehabilitation. Furthermore, we try to establish an evidence base for the concept of "Sample handlings" and examine whether this concept can be used as a flexible methodological tool for developing workflows that promote patient participation in their own rehabilitation. Four nurses and four social and health assistants have an active "co-researcher" role. The interaction with the researchers creates a reflexive and dynamic process with a learning and competence...
Background: One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results: We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion: Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
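The annotation task described above, joining SNPs from a genotyping experiment against a gene/phenotype lookup, can be sketched in a few lines. All identifiers below are invented for illustration; this is not Genephony's actual API or data model.

```python
# Hypothetical lookup tables (invented identifiers, for illustration only).
snp_to_gene = {"rs0001": "GENE_A", "rs0002": "GENE_B", "rs0003": "GENE_A"}
phenotype_genes = {"GENE_A"}  # genes known to be associated with the phenotype

def annotate(snps):
    """Return (snp, gene, associated?) triples for each SNP we can map."""
    return [
        (snp, snp_to_gene[snp], snp_to_gene[snp] in phenotype_genes)
        for snp in snps
        if snp in snp_to_gene  # unmappable SNPs are silently dropped
    ]

rows = annotate(["rs0001", "rs0002", "rs9999"])
```

In a real pipeline the lookup tables would be populated from the knowledge repositories the abstract mentions rather than hard-coded.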
Stender, V.; Schroeder, M.; Wächter, J.
Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. The management of research data in particular is gaining more and more importance. In the geosciences, these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences aims to establish the concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present the data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of the tools helped to establish a standardized architectural blueprint serving as a contribution to research infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance efforts are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a
This contribution describes ‘Research Game’, a game produced in a Lifelong Learning Programme-Comenius Project (The European Scientific Research Game), which aims at motivating secondary school students through the experience of the excitement of scientific research. The project proposes practical and didactic works which combine theoretical activities with ICT in order to introduce students to scientific research. Students collaborated internationally across Europe to build hypotheses, carry out research, test the validity of their hypotheses and finalize a theory based on their findings. On the project platform (www.researchgame.eu/platform), teachers and students registered, created a team, interacted in a forum space, played and learned science in a new, innovative way. There, the students shared their research findings with other groups from all over Europe and finally competed online, playing a serious game and showing that they were able to apply the scientific method.
Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement the mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing for simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing for observation of group-level dynamics; (3) heat maps of levels of indicators changing over time, allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further participation in open data standards like HICDEP by the HIV research community.
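A preliminary step for longitudinal plots like those described above is reshaping flat patient-level records into per-patient time series. A minimal sketch follows; the data and field names are invented, and the actual tool works on HICDEP-conformant data in R rather than Python.

```python
from collections import defaultdict

# Invented records: (patient_id, days_since_ART_start, cd4_count).
records = [
    ("p1", 0, 180), ("p1", 180, 310), ("p2", 0, 95), ("p1", 360, 420),
]

# Group measurements by patient, then sort each series chronologically.
series = defaultdict(list)
for pid, day, cd4 in records:
    series[pid].append((day, cd4))
for pid in series:
    series[pid].sort()  # ascending by days since ART initiation
```

Each `series[pid]` is then ready to be drawn as one trajectory in a longitudinal plot.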
Dojeiji, Sue; Byszewski, Anna; Wood, Tim
There is a paucity of evidence-based literature on the essential communication and collaboration skills to guide health care teams in conducting and assessing their performance in the Family Conference (FC). The authors developed and collected validity evidence for a rating scale of team FC performance, the Family Conference Rating Scale (FCRS). In phase 1, essential FC communication and collaboration skills were identified through a review of existing communication tools and literature on team functioning; a draft 34-item scale was developed. In phase 2, the scale was narrowed to a 6-category, 9-point scale with descriptors of expected behaviours through an iterative process: testing of the scale on 10 FC transcripts by two experts, soliciting feedback from a focus group of seven health care providers, and testing by non-experts on 49 live FCs. In phase 3, scores on the revised scale were validated by 10 health care providers from different disciplines by rating three videos of FCs of variable quality. Raters were able to detect inter-video variation in FC quality. The reliability of the FCRS was 0.95 and the inter-rater reliability, 0.68. The FCRS may enhance the ability of health professions educators to teach and assess interprofessional patient-centred communication and collaboration competencies.
Esmail, Aneez; Valderas, Jose M; Verstappen, Wim; Godycki-Cwirko, Maciek; Wensing, Michel
This paper is an introduction to a supplement to The European Journal of General Practice, bringing together a body of research focusing on the issue of patient safety in relation to primary care. The supplement represents the outputs of the LINNEAUS collaboration on patient safety in primary care, which was a four-year (2009-2013) coordination and support action funded under the Framework 7 programme by the European Union. Being a coordination and support action, its aim was not to undertake new research, but to build capacity through engaging primary care researchers and practitioners in identifying some of the key challenges in this area and developing consensus statements, which will be an essential part in developing a future research agenda. This introductory article describes the aims of the LINNEAUS collaboration, provides a brief summary of the reasons to focus on patient safety in primary care, the epidemiological and policy considerations, and an introduction to the papers included in the supplement.
Valeria Gisela Perez
This paper develops a reflection on the importance of research on accounting subjects in the training of professional accountants. This importance rests on the capacity of research to increase the wealth of the discipline under investigation, which can be converted into a skill and/or competence which accountants are required to demonstrate in their professional practice. Furthermore, accounting is recognized by the authors as a science in constant development, and thus open to investigation. This change in knowledge is an element that motivates professionals to be constantly updated, this aspect (constant updating) becoming the skill and competence that research can bring to professional training in university classrooms. The reflection is based on the study of documents developed by prestigious authors in accounting theory, teaching and research. Therefore, this paper concludes that research is a useful tool for professional accounting training and rewards important skills and competencies for professional practice; it can also be conceived as a strategy for technical and educational activities that allows students to recreate knowledge, anticipating the future updates that their professional practice will require. Key words: Accounting research, university teaching, accounting education.
Weingart, R.C.; Chau, H.H.; Goosman, D.R.; Hofer, W.W.; Honodel, C.A.; Lee, R.S.; Steinberg, D.J.; Stroud, J.R.
We have developed a new tool for ultrahigh-pressure research at LLL. This system, which we call the electric gun, has already achieved thin flyer plate velocities in excess of 20 km/s and pressures of the order of 2 TPa in tantalum. We believe that the electric gun is competitive with laser- and nuclear-driven methods of producing shocks in the 1-to-5 TPa range because of its precision and ease and economy of operation. Its development is recommended for shock initiation studies, dry runs for Site 300 hydroshots, and as a shock wave generator for surface studies
Introduction: This paper describes the development of a ‘Research for Impact’ Tool against a background of concerns about the over-researching of Aboriginal and Torres Strait Islander people’s issues without demonstrable benefits. Material and Methods: A combination of literature reviews, workshops with researchers and reflections by project team members and partners using participatory snowball techniques. Results: Assessing research impact is difficult, akin to a so-called ‘wicked problem’, but not impossible. A heuristic and collaborative approach to research that takes in the expectations of research users, those being researched and the funders of research offers a pragmatic solution to evaluating research impact. The proposed ‘Research for Impact’ Tool is based on the understanding that the value of research is to create evidence and/or products that support smarter decisions so as to improve the human condition. Research is of limited value unless the evidence produced is used to inform smarter decisions. A practical way of approaching research impact is therefore to start with the decisions confronting decision makers, whether they are government policymakers, professional practitioners or households, the extent to which the research supports smarter decisions, and the knock-on consequences of such smart decisions. Embedded at each step in the impact planning, monitoring and evaluation process is the need for Indigenous leadership and participation, capacity enhancement, collaborative partnerships and participatory learning-by-doing approaches across partners. Discussion: The tool is designed in the context of Indigenous research, but the basic idea, that the way to assess research impact is to start upfront by defining the users of research and their information needs, the decisions confronting them, and the extent to which research informs smarter decisions, is equally applicable to research in other settings, both applied and
Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of the parametric and non-parametric tests used for data analysis.
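The measures of central tendency mentioned above can be illustrated with Python's standard statistics module (the sample values are invented):

```python
import statistics

measurements = [4, 5, 5, 6, 7, 9]  # invented sample data

mean_val = statistics.mean(measurements)      # arithmetic average
median_val = statistics.median(measurements)  # middle value of the sorted data
mode_val = statistics.mode(measurements)      # most frequently occurring value
```

For this sample the mean is 6, the median 5.5 and the mode 5; which measure best summarises a dataset depends on its distribution, which is exactly the kind of judgement the article's overview of variables is meant to support.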
Roeder, L.; Jundt, R.
Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.
Hartley, D.S. III
This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.
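A frequency distribution by category, of the kind the report derives for historical OOTWs, can be sketched with a counter. The operation names and category labels below are illustrative only and do not reflect the report's actual taxonomy.

```python
from collections import Counter

# Invented (operation, category) pairs for illustration.
operations = [
    ("Operation A", "humanitarian assistance"),
    ("Operation B", "humanitarian assistance"),
    ("Operation C", "peacekeeping"),
    ("Operation D", "peacekeeping"),
    ("Operation E", "show of force"),
]

# Tally how many historical operations fall into each category.
by_category = Counter(category for _, category in operations)
```

The same tally keyed on responsible CINC instead of category would give the report's second frequency distribution.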
Martínez Ruiz, María Ángeles; Ávalos Ramos, María Alejandra; Merma Molina, Gladys
The aim of this study is to analyse the metaphorical expressions designed by Science of Sport and Physical Activity university students as a tool for inquiring into two research questions: their perceptions of their physical education teachers, and the meaning physical activity has in the students' personal lives. 51 students from the University of Alicante participated in the study. The qualitative data analysis software AQUAD 6 was used for data processing. The results obtained from the analysis of ...
chose CHOP as our primary target for drug development. Currently, we are in the process of screening a library of 1274 drugs, all of which are already used in human subjects, for CHOP inhibitors. The last topic of our discussion is future possibilities for glaucoma management. First, we discuss the development of next generation in vivo imaging modalities that allow detailed description of pathomechanisms of this multifactorial disease, glaucoma. The purpose of this research was to improve the efficacy of glaucoma diagnosis and to visualize its pathology at a cellular/molecular level and develop molecule-specific therapies. Currently available visual field tests are subjective, since they rely on a determination of the threshold of light perception, and are affected by poor reproducibility. The current dependence on visual field tests to ascertain the progression of glaucoma is thus a serious limitation on an important task of ophthalmologists. We, therefore, turned our focus to the establishment of an in vivo imaging method to detect dying retinal ganglion cells, which would highlight the pathologic state of glaucoma with high sensitivity. To this end, we used confocal scanning ophthalmoscopy to assess the usefulness of SYTOX Orange as a cell death probe. Our results showed that this probe could reveal dying retinal ganglion cells clearly, quickly and with high sensitivity. We, therefore, believe that the clinical application of probes that can sensitively detect dying retinal ganglion cells is a highly promising approach. This also applies to the use of molecular tools that can provide information on the molecular pathology of glaucoma. Finally, we would like to introduce our national collaborative work on the analysis of "big-data". The project aims to collect as wide a range of data as possible at an unprecedented scale. The data to be registered ranges from basic glaucoma data, such as IOP and visual field test results, to data from the most sophisticated
Software Tools for Battery Design: Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to support battery design. Knowledge of the interplay of multi-physics at varied scales is imperative
Kaczmarczyk, Lech; Jackson, Walker S
The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.
Kim, Jong-Won; Kim, Dogyun
Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.
Leggett, Graham J
Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide powerful support for research into metamaterials.
Skonieczny, Łukasz; Rybiński, Henryk; Kryszkiewicz, Marzena; Niezgódka, Marek
This book is a selection of results obtained within three years of research performed under SYNAT, a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and the open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled “Intelligent Tools for Building a Scientific Information Platform” and “Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions”, were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its contents are based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview of and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building...
Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de, E-mail: email@example.com, E-mail: firstname.lastname@example.org [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear
Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group much diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group and the diversified nature of their relations with the group and with IPEN. Then the technical competencies and current research lines (or programs) are described, as well as the research projects and the management scheme of the group. In the sequence, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software in this area, but we will focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platform (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, like graduate courses and regular seminars, and how newcomers are selected and trained to be able to enroll in the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)
Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K
Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
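One family of quantitative descriptors used in surface texture analysis is the areal height parameters, such as the root-mean-square height Sq. The sketch below computes Sq for a small invented height map; it is a generic illustration, not the paper's actual sampling or analysis protocol.

```python
import math

# Invented height map (micrometres), standing in for a measured surface patch.
heights = [
    [0.1, -0.2, 0.0],
    [0.3, -0.1, -0.1],
]

# Flatten the grid, centre on the mean plane, then take the RMS deviation.
flat = [h for row in heights for h in row]
mean_h = sum(flat) / len(flat)
sq = math.sqrt(sum((h - mean_h) ** 2 for h in flat) / len(flat))
```

As the paper stresses, the value of such a parameter depends strongly on where and at what scale the surface is sampled, which is why the sampling protocol matters as much as the metric.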
There are many ways in which pain in animals can be measured, and these are based on a variety of phenomena that are related either to the perception of pain or to alterations in physical or behavioural features of the animal that are caused by that pain. The features of pain that are most useful for assessment in clinical environments are not always the best to use in a research environment. This is because the aims and objectives of the two settings are different, and so whilst particular techniques will have the same advantages and disadvantages in clinical and research environments, these considerations may become more or less of a drawback when moving from one environment to the other. For example, a simple descriptive pain scale has a number of advantages and disadvantages. In a clinical setting the advantages are very useful and the disadvantages are less relevant, but in a research environment the advantages are less important and the disadvantages can become more problematic. This paper will focus on pain in the research environment and, after a brief review of the pathophysiological systems involved, will attempt to outline the major advantages and disadvantages of the more commonly used measurement techniques that have been used for studies in the area of pain perception and analgesia. This paper is expanded from a conference proceedings paper presented at the International Veterinary Emergency and Critical Care Conference in San Diego, USA.
Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum free disk space required is 2 MB.
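The query-then-cache pattern that GATO's local database implements can be sketched as follows. This is a minimal illustration only: the function name, table schema, and gene identifier are invented for the example, not GATO's actual PHP/Perl code.

```python
import sqlite3

# Hypothetical stand-in for GATO's queries to Web-accessible resources
# (illustrative only; the real tool queries remote annotation services).
def fetch_remote_annotation(gene_id):
    return f"annotation-for-{gene_id}"  # placeholder for a real web query

class AnnotationCache:
    """Store annotation results locally so repeated queries for a common
    set of genes skip the network, as GATO's local database does."""
    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE ann (gene TEXT PRIMARY KEY, info TEXT)")
        self.remote_calls = 0

    def annotate(self, gene_id):
        row = self.db.execute(
            "SELECT info FROM ann WHERE gene = ?", (gene_id,)).fetchone()
        if row:                      # cache hit: previous result reused
            return row[0]
        self.remote_calls += 1       # cache miss: query the remote resource
        info = fetch_remote_annotation(gene_id)
        self.db.execute("INSERT INTO ann VALUES (?, ?)", (gene_id, info))
        return info

cache = AnnotationCache()
first = cache.annotate("TP53")
second = cache.annotate("TP53")      # served from the local database
```

The key property, as in GATO, is that the second request for the same gene never touches the remote resource.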
Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa
In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be used for resist characterization and for evaluation of pre- and post-exposure processing. The tool follows the Talbot lithography approach: it uses a partially coherent discharge-produced plasma (DPP) source and reduces the other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between the components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron-beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
Grace, Stephen C; Embry, Stephen; Luo, Heng
Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization options and supports non
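Haystack's core binning idea, converting the mass dimension into interval variables, might look roughly like this minimal sketch. The bin width, mass range, and all names are invented for illustration; the real tool's parameters and implementation may differ.

```python
# Minimal sketch (not Haystack's actual code) of its core idea: converting
# the mass dimension of an LC-MS run into fixed-width interval variables
# ("bins") that give every sample a feature vector of the same length.
def bin_masses(mz_values, intensities, mz_min=100.0, mz_max=1000.0, width=1.0):
    """Sum ion intensities into m/z bins, yielding a fixed-length vector
    suitable for PCA or cluster analysis across many samples."""
    n_bins = int((mz_max - mz_min) / width)
    vec = [0.0] * n_bins
    for mz, inten in zip(mz_values, intensities):
        if mz_min <= mz < mz_max:
            vec[int((mz - mz_min) / width)] += inten
    return vec

# Two toy "samples": identical except for a discriminatory ion at m/z 350.2
sample_a = bin_masses([150.1, 350.2, 720.5], [10.0, 40.0, 5.0])
sample_b = bin_masses([150.1, 720.5], [10.0, 5.0])
diffs = [abs(a - b) for a, b in zip(sample_a, sample_b)]
discriminatory_bin = diffs.index(max(diffs))
```

Because every sample maps onto the same fixed-length vector, exploratory methods such as PCA can then compare samples bin by bin, which is how discriminatory features are surfaced.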
[Sex- and gender-sensitive research in epidemiology and medicine: how can this be achieved? Aims and first results of the network "Sex-/Gender-Sensitive Research in Epidemiology, Neurosciences and Genetics/Cancer Research"].
Jahn, I; Gansefort, D; Kindler-Röhrborn, A; Pfleiderer, B
It is considered general knowledge among physicians and epidemiologists that biological and social aspects associated with being male or female have a strong influence on health and disease. Integrating these aspects into research is necessary to counteract the problems--including ethical problems--resulting from a different evidence basis for men and women. From January 2011 to June 2014 the Federal Ministry of Education and Research supported the network "Sex-/Gender-Sensitive Research in Epidemiology, Neuroscience and Genetics/Cancer Research" with three subprojects, which aimed to promote gender-sensitive research practices. The concepts and results are presented in this article. The subprojects gathered data (literature analyses, questionnaires) and offered programs for young scientists. Experiences and results were collected and generalized, for instance in the form of definitions of terms. Fifty young scientists took part in the training program, identifying associations and barriers in sex-/gender-sensitive research. Among others, a working definition for "sex-/gender-sensitive research" was developed, as well as definitions for the terms "sex-specific" (for biological characteristics that are specific to men or women) and "sex-/gender-dependent" or "sex-/gender-associated" (for biological and social factors whose extent of occurrence differs between the sexes). The concepts realized by the network are well suited to stimulate further development and discussion. The definition of terms is an important basis for a productive and high-yielding interdisciplinary collaboration.
Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley
Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess the built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas, or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of consistent national data measuring land use mix, at this stage it has not been possible to create a national walkability measure. The next stage of the project is to increase the usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia. It
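Composite walkability indices of the kind described, combining street connectivity, density and land use mix, are commonly built as a sum of z-scores across neighbourhoods. The sketch below assumes that convention; it is not necessarily the AURIN tool's exact formula, and all input figures are invented.

```python
import math

def entropy_land_use_mix(proportions):
    """Land use mix as normalised entropy: 0 = single use, 1 = even mix."""
    props = [p for p in proportions if p > 0]
    if len(props) <= 1:
        return 0.0
    return -sum(p * math.log(p) for p in props) / math.log(len(proportions))

def z_scores(values):
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd for v in values]

def walkability(connectivity, density, mix):
    """Sum the z-scores of the three sub-measures for each neighbourhood."""
    return [c + d + m for c, d, m in
            zip(z_scores(connectivity), z_scores(density), z_scores(mix))]

# Three toy neighbourhoods: intersections/km^2, dwellings/ha, land-use shares
mix = [entropy_land_use_mix(p) for p in
       ([0.5, 0.3, 0.2], [1.0, 0.0, 0.0], [0.34, 0.33, 0.33])]
index = walkability([120, 40, 80], [35, 10, 22], mix)
```

Because each component is standardised before summing, the index ranks neighbourhoods relative to each other, which is exactly what a benchmarking tool needs; the single-use neighbourhood scores lowest here.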
Millar, A. Z.; Perry, S.
Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns divided themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, and with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part
Ion Danut I. JUGANARU
This study aims at analyzing the distribution of tourist flows in 2014, from 25 European countries, across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists' countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups defined by similar characteristics. Since the choice/purchase decision is based on purposes, knowing them proves useful in designing strategies to increase the level of satisfaction provided to the customer. The statistical method used in this paper is factorial correspondence analysis. In our opinion, identifying by this method the existence of differences or similarities between the tourists' countries of residence and their trip purposes can represent a useful step in studying the tourism market and in choosing or reformulating strategies.
Andreia Salvan Pagnan
Within the universe of women's clothing, underwear long occupied an insignificant place with regard to the development of new textile materials, shapes and colors. Panties, once known as breeches or long underwear, only became a necessity around the twentieth century, with the vaporous dresses of Christian Dior in the 1950s. Technological advances in the textile industry brought spandex, created by the American laboratory DuPont and better known as Lycra. The elasticity of the fabric gave comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand users' desires, a qualitative study was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared with a new one to be launched. Through the Quality Function Deployment (QFD) tool, the answers given by users were interpreted to prioritize targets for the development of a product based on analyses of desired characteristics, which are converted into technical attributes.
Mychasiuk, R; Benzies, K
Facebook is currently one of the world's most visited websites, and home to millions of users who access their accounts on a regular basis. Owing to the website's ease of accessibility and free service, demographic characteristics of users span all domains. As such, Facebook may be a valuable tool for locating and communicating with participants in longitudinal research studies. This article outlines the benefit gained in a longitudinal follow-up study, of an intervention programme for at-risk families, through the use of Facebook as a search engine. Using Facebook as a resource, we were able to locate 19 participants that were otherwise 'lost' to follow-up, decreasing attrition in our study by 16%. Additionally, analysis indicated that hard-to-reach participants located with Facebook differed significantly on measures of receptive language and self-esteem when compared to their easier-to-locate counterparts. These results suggest that Facebook is an effective means of improving participant retention in a longitudinal intervention study and may help improve study validity by reaching participants that contribute differing results. © 2011 Blackwell Publishing Ltd.
Analysis Tools: NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.
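To illustrate how raw usage-pattern data can be reduced to a behavioural feature of the kind the Beiwe platform analyzes, here is a hedged sketch. The event format and function names are invented for the example and are not Beiwe's actual schema or API.

```python
from datetime import datetime

# Illustrative sketch: derive daily screen-on time (a simple behavioural
# feature) from raw (timestamp, state) usage events, where state is
# "on" or "off". The data format here is an assumption for the example.
def daily_screen_time(events):
    """Return seconds of screen-on time per calendar day."""
    totals, on_since = {}, None
    for ts, state in events:
        t = datetime.fromisoformat(ts)
        if state == "on":
            on_since = t
        elif state == "off" and on_since is not None:
            day = on_since.date().isoformat()
            totals[day] = totals.get(day, 0.0) + (t - on_since).total_seconds()
            on_since = None
    return totals

log = [("2024-05-01T09:00:00", "on"), ("2024-05-01T09:05:00", "off"),
       ("2024-05-01T21:00:00", "on"), ("2024-05-01T21:01:30", "off"),
       ("2024-05-02T10:00:00", "on"), ("2024-05-02T10:10:00", "off")]
usage = daily_screen_time(log)
```

Features like this, computed passively day after day, are the kind of moment-by-moment quantification that statistical learning tools can then relate to clinical state.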
One of the key goals in the SEMI industry is to improve equipment throughput and ensure maximization of equipment production efficiency. This paper, based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness verified successfully, yielding the parameter values used to measure equipment performance, along with advice for improvement.
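The finite-state-machine idea can be sketched as follows. The state names follow common SEMI E10 categories, but the transition table and time accounting below are illustrative assumptions, not the paper's actual rules.

```python
# Allowed equipment-state transitions (an invented, simplified table).
ALLOWED = {
    "standby":          {"productive", "scheduled_down", "unscheduled_down"},
    "productive":       {"standby", "unscheduled_down"},
    "scheduled_down":   {"standby"},
    "unscheduled_down": {"standby"},
}

def accumulate(transitions):
    """Walk (timestamp_s, next_state) events from an initial standby state,
    rejecting illegal transitions and summing time spent in each state."""
    state, t0, totals = "standby", 0, {}
    for ts, nxt in transitions:
        if nxt not in ALLOWED[state]:
            raise ValueError(f"illegal transition {state} -> {nxt}")
        totals[state] = totals.get(state, 0) + (ts - t0)
        state, t0 = nxt, ts
    return totals

# Toy event log (seconds): run, idle, run, breakdown, repair complete
log = [(100, "productive"), (700, "standby"), (760, "productive"),
       (1000, "unscheduled_down"), (1300, "standby")]
times = accumulate(log)
utilisation = times["productive"] / sum(times.values())
```

Per-state time totals of this kind are what performance parameters such as utilisation are computed from.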
Aim/Purpose: These days educators are expected to integrate technological tools into classes. Although they acquire relevant skills, they are often reluctant to use these tools. Background: We incorporated online forums for generating a Community of Inquiry (CoI) in a faculty development program. Extending the Technology, Pedagogy, and Content Knowledge (TPACK) model with Assessment Knowledge, and using content analysis of forum discourse and reflection after each CoI, we offer the Diagnostic Tool for Learning, Assessment, and Research (DTLAR). Methodology: This study spanned two cycles of a development program for medical faculty. Contribution: This study demonstrates how the DTLAR supports in-depth examination of the benefits and challenges of using CoIs for learning and teaching. Findings: Before the program, participants had little experience with, and were reluctant to use, CoIs in classes. At the program's completion, many were willing to adopt CoIs and appreciated the method's contribution. Both CoI discourse and reflections included positive attitudes regarding cognitive and teacher-awareness categories. However, negative attitudes regarding affective aspects and the time-consuming nature of CoIs were exposed. Participants who experienced facilitating a CoI gained additional insights into its usefulness. Recommendations for Practitioners: The DTLAR allows analyzing the adaptation of online forums for learning and teaching. Recommendation for Researchers: The DTLAR allows analyzing factors that affect the acceptance of online forums for learning and teaching. Impact on Society: While the tool was implemented in the context of medical education, it can be readily applied in other adult learning programs. Future Research: The study includes several design aspects that probably affected the improvements and challenges we found. Future research is called for to provide guidelines for identifying boundary conditions and potential for further
Human rights education (HRE) aims to achieve a change of mindsets and social attitudes that entails the construction of a culture of respect towards those values it teaches. Although HRE is a recent field of study, its consolidation in Latin America is a fact. During the latest decades several authors have carried out research related to HRE that…
Rector, Travis A.; Vogt, Nicole P.
Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However, relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed, nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes). Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database. We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
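Two of the measurements these tools automate, redshift from a shifted line and equivalent width, reduce to short calculations. This sketch uses the standard formulas with toy data; the sampled line profile is invented for the example.

```python
# Redshift from a shifted spectral line: z = (lambda_obs - lambda_rest) / lambda_rest
def redshift(lambda_obs, lambda_rest):
    return (lambda_obs - lambda_rest) / lambda_rest

# Equivalent width: trapezoidal integral of (1 - F/F_continuum) across the
# line, with flux already normalised to the continuum (wavelengths in A).
def equivalent_width(wavelengths, normalised_flux):
    ew = 0.0
    for i in range(len(wavelengths) - 1):
        dlam = wavelengths[i + 1] - wavelengths[i]
        ew += 0.5 * ((1 - normalised_flux[i]) +
                     (1 - normalised_flux[i + 1])) * dlam
    return ew

# H-alpha (rest wavelength 6562.8 A) observed at 7219.1 A
z = redshift(7219.1, 6562.8)

# Toy absorption line sampled on a uniform 1 A grid
wl = [6560, 6561, 6562, 6563, 6564]
flux = [1.0, 0.8, 0.4, 0.8, 1.0]
ew = equivalent_width(wl, flux)
```

This is the same arithmetic a student performs interactively in the tools: identify the line, read off its observed wavelength, and integrate the normalised profile.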
The aim of this study is to investigate whether Lego could be used as a tool for reflective practice with social care practitioners (SCPs) and student practitioners. This article outlines an action research study conducted in an institute of higher education in Ireland. Findings from this study suggest that Lego can be used to support student…
M. M. Aligadjiev
Aim. The paper discusses the improvement of methods of hydrobiological studies by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov-Black Sea comb jelly Mnemiopsis leidyi A. Agassiz into the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples, with an upper inverted cone that will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.
In this paper, a test bench for a sonic logging tool is proposed and designed to realize automatic calibration and testing of the tool. The test-bench system consists of a host computer, an embedded controlling board, and functional boards. The host computer serves as the human-machine interface (HMI) and processes uploaded data. The software running on the host computer is designed in VC++ and developed using multithreading, Dynamic Link Library (DLL) and Multiple Document Interface (MDI) techniques. The embedded controlling board uses an ARM7 microcontroller and communicates with the host computer via Ethernet. The embedded controlling board software is realized on the embedded uClinux operating system with a layered architecture. The functional boards are designed around Field Programmable Gate Arrays (FPGAs) and provide test interfaces for the logging tool. The functional board software is divided into independent sub-modules that can be reused by the various functional boards and are then integrated in the top layer. With the layered architecture and modularized design, the software system is highly reliable and extensible. With the help of the designed system, a test was conducted quickly and successfully on the electronic receiving cabin of the sonic logging tool, demonstrating that the system can greatly improve the production efficiency of the sonic logging tool.
Bruun Larsen, Lars; Skonnord, Trygve; Gjelstad, Svein
… in primary care research. Examples of this are online randomisation, electronic questionnaires, automatic email scheduling, mobile phone applications and data extraction tools. The amount of data can be increased at low cost, and this can help to reach adequate sample sizes. However, there are still … challenges within the field. To secure a high response rate, you need to follow up manually or use another application. There are also practical and ethical problems, and the security of sensitive data has to be handled carefully. Session content: Oral presentations about some technological …
Waterlander, Wilma E; Scarpa, Michael; Lentz, Daisy; Steenhuis, Ingrid H M
Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food purchasing behaviour. The application can be obtained via a URL
Dec 29, 2008 ... It is on this premise that this article presents Bayes' theorem as a vital tool. A brief intuitive ... diseased individual will be selected or that a disease-free individual will be selected? ... Ultrasound physics and instruction, 3rd ed. ...
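The screening-test question posed in the snippet is the canonical application of Bayes' theorem. Here is a minimal worked sketch; the prevalence, sensitivity, and specificity figures are invented for the example.

```python
# P(disease | positive test) via Bayes' theorem:
#   posterior = P(+|D) P(D) / [ P(+|D) P(D) + P(+|not D) P(not D) ]
def posterior_positive(prevalence, sensitivity, specificity):
    true_pos = sensitivity * prevalence                # P(+|D) P(D)
    false_pos = (1 - specificity) * (1 - prevalence)   # P(+|not D) P(not D)
    return true_pos / (true_pos + false_pos)

# 1% prevalence, 95% sensitivity, 90% specificity: despite the accurate
# test, most positives are false because the disease is rare.
ppv = posterior_positive(0.01, 0.95, 0.90)
```

With these numbers the probability that a positive individual is actually diseased is only about 8.8%, which is exactly the kind of counterintuitive result the theorem makes explicit.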
Kirkpatrick, CJ; Otto, M; van Kooten, T; Krump, [No Value; Kriegsmann, J; Bittinger, F
Progress in biocompatibility and tissue engineering would today be inconceivable without the aid of in vitro techniques. Endothelial cell cultures represent a valuable tool not just in haemocompatibility testing, but also in the concept of designing hybrid organs. In the past endothelial cells (EC)
The present knowledge society requires statistical literacy: the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
The types and uses of research reactors are reviewed. After an analysis of the world situation, demand for new research reactors of about 20 MW is foreseen. The experience and competitiveness of INVAP S.E. as a designer and constructor of research reactors are outlined, and the general specifications of the reactors designed by INVAP for Egypt and Australia are given.
Veeck, Ann; Hoger, Beth
Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…
At last, the first systematic guide to the growing jungle of citation indices and other bibliometric indicators. Written with the aim of providing a complete and unbiased overview of all available statistical measures for scientific productivity, the core of this reference is an alphabetical dictionary of indices and other algorithms used to evaluate the importance and impact of researchers and their institutions. In 150 major articles, the authors describe all indices in strictly mathematical terms without passing judgement on their relative merit. From widely used measures, such as the journal impact factor or the h-index, to highly specialized indices, all indicators currently in use in the sciences and humanities are described, and their application explained. The introductory section and the appendix contain a wealth of valuable supporting information on data sources, tools and techniques for bibliometric and scientometric analysis - for individual researchers as well as their funders and publishers.
Gelinas, Luke; Pierce, Robin; Winkler, Sabune; Cohen, I Glenn; Lynch, Holly Fernandez; Bierer, Barbara E
The focus of the China Journal of Accounting Research is to publish theoretical and empirical research papers that use contemporary research methodologies to investigate issues about accounting, finance, auditing and corporate governance in China, the Greater China region and other emerging markets. The Journal also publishes insightful commentaries about China-related accounting research. The Journal encourages the application of economic and sociological theories to analyze and
The focus of the China Journal of Accounting Research is to publish theoretical and empirical research papers that use contemporary research methodologies to investigate issues about accounting, finance, auditing and corporate governance in China, the Greater China region and other emerging markets. The Journal also publishes insightful commentaries about China-related accounting research. The Journal encourages the application of economic and sociological theories to analyze and explain accounting issues under Chinese capital markets accurately and succinctly. The published research articles of the Journal will enable scholars
Akl, Elie A; Fadlallah, Racha; Ghandour, Lilian; Kdouh, Ola; Langlois, Etienne; Lavis, John N; Schünemann, Holger; El-Jardali, Fadi
Groups or institutions funding or conducting systematic reviews in health policy and systems research (HPSR) should prioritise topics according to the needs of policymakers and stakeholders. The aim of this study was to develop and validate a tool to prioritise questions for systematic reviews in HPSR. We developed the tool following a four-step approach consisting of (1) defining the purpose and scope of the tool, (2) item generation and reduction, (3) testing for content and face validity, and (4) pilot testing of the tool. The research team involved international experts in HPSR, systematic review methodology and tool development, led by the Center for Systematic Reviews on Health Policy and Systems Research (SPARK). We followed an inclusive approach in determining the final selection of items to allow customisation to the user's needs. The purpose of the SPARK tool was to prioritise questions in HPSR in order to address them in systematic reviews. In the item generation and reduction phase, an extensive literature search yielded 40 relevant articles, which were reviewed by the research team to create a preliminary list of 19 candidate items for inclusion in the tool. As part of testing for content and face validity, input from international experts led to the refining, changing, merging and addition of new items, and to organisation of the tool into two modules. Following pilot testing, we finalised the tool, with 22 items organised in two modules: the first module includes 13 items to be rated by policymakers and stakeholders, and the second includes 9 items to be rated by systematic review teams. Users can customise the tool to their needs by omitting items that may not be applicable to their settings. We also developed a user manual that provides guidance on how to use the SPARK tool, along with signalling questions. We have developed and conducted initial validation of the SPARK tool to prioritise questions for systematic reviews in HPSR, along with
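The two-module rating structure described above (13 policymaker/stakeholder items and 9 review-team items, with omitted items allowed) lends itself to a simple aggregate score. The equal-weight averaging below is an illustrative assumption, not part of the published tool:

```python
def spark_priority(policymaker_ratings, review_team_ratings):
    """Combine the two SPARK-style modules into one priority score.

    Ratings are numeric (e.g. 1-5); None marks an item the user omitted
    when customising the tool to their setting.
    """
    def module_score(ratings):
        kept = [r for r in ratings if r is not None]
        return sum(kept) / len(kept)

    # equal weighting of the two modules is an assumption for illustration
    return (module_score(policymaker_ratings) + module_score(review_team_ratings)) / 2
```

Questions would then be ranked by this score before being assigned to systematic review teams.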
Rather than produce clear-cut answers to well-defined problems, research on future environmental policy issues requires a different approach whereby researchers are partners in joint learning processes among stakeholders, policy makers, NGOs (Non-Governmental Organisations) and industry. This
Abbott, Rodman P.; Stracener, Jerrell
This study investigates the relationship between the designated research project system independent variables of Labor, Travel, Equipment, and Contract total annual costs and the dependent variables of both the associated matching research project total annual academic publication output and thesis/dissertation number output. The Mahalanobis…
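The truncated abstract above references the Mahalanobis statistic, presumably the Mahalanobis distance used to screen multivariate observations (such as the cost variables) against their joint distribution. A minimal two-variable sketch; the data and covariance are invented for illustration:

```python
import math

def mahalanobis_2d(x, mean, cov):
    """Mahalanobis distance of point x from a 2-D distribution (mean, cov)."""
    # invert the 2x2 covariance matrix by the closed-form adjugate formula
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    # d^2 = dx^T * inv(cov) * dx
    y0 = inv[0][0] * dx[0] + inv[0][1] * dx[1]
    y1 = inv[1][0] * dx[0] + inv[1][1] * dx[1]
    return math.sqrt(dx[0] * y0 + dx[1] * y1)
```

With an identity covariance the measure reduces to the ordinary Euclidean distance; with correlated variables it discounts directions of high joint variance, which is what makes it useful for multivariate outlier screening.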
Noyons, Everard Christiaan Marie
Bibliometric maps of science are landscapes of scientific research fields created by quantitative analysis of bibliographic data. In such maps the 'cities' are, for instance, research topics. Topics with a strong cognitive relation are in each other's vicinity and topics with a weak relation are
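Proximity in such maps is typically derived from a similarity measure over co-occurrence counts: topics that appear together often get a small map distance. A minimal sketch using cosine similarity; the topic vectors are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two co-occurrence count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def map_distance(u, v):
    # strongly related topics -> distance near 0 (placed close on the map)
    return 1.0 - cosine(u, v)
```

A layout algorithm (e.g. multidimensional scaling) would then place topics so that map distances approximate these values.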
Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox
Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...
Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…
Roo, A.P.J. de; Thielen, J.; Feyen, L.; Burek, P.; Salamon, P.
The floods in the rivers Meuse and Rhine in 1993 and 1995 made the European Commission realize that further research on floods, especially in transboundary river catchments, was also necessary at Commission level. This led to the start of a dedicated research project on floods at the European
Buildings need to be more environmentally benign, since the building sector is responsible for about 40% of all energy and material use in Sweden. For this reason a unique cooperation between companies, municipalities and the Government called "Building-Living and Property Management for the Future", in short "The Building Living Dialogue", has been going on since 2003. The project focuses on: (a) a healthy indoor environment, (b) efficient use of energy, and (c) efficient resource management. In accordance with the dialogue targets, two research projects were initiated aiming at developing an environmental rating tool that takes into account both building-sector requirements and expectations and national and international research findings. This paper describes the first phase of the development work, in which stakeholders and researchers cooperate. It includes results from inventories and, based on this experience, discusses procedures for developing assessment tools and what the desirable features of a broadly accepted building rating tool could be.
<正>The focus of the China Journal of Accounting Research is to publish theoretical and empirical research papers that use contemporary research methodologies to investigate issues about accounting,finance,auditing and corporate governance in China,the Greater China region and other emerging markets.The Journal also publishes insightful commentaries about China-related accounting research.The Journal encourages the application of economic and sociological theories to analyze and explain accounting issues within the legal and institutional framework of China,and
The objective of this study is to develop an evidencebased research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT).A review was conducted drawing from the (1) implementati...
The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding user response and market ...
Ivancic, William D.
Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access are addressed.
Klein, P. D.; Hachey, D. L.; Kreek, M. J.; Schoeller, D. A.
Recent developments in the use of the stable isotopes ¹³C, ¹⁵N, ¹⁷O, and ¹⁸O as tracers in research studies in the fields of biology, medicine, pharmacology, and agriculture are briefly reviewed. (CH)
Massi, Luciana; Santos, Gelson Ribeiro dos; Ferreira, Jerino Queiroz; Queiroz, Salete Linhares
Chemistry teachers increasingly use research articles in their undergraduate courses. This trend arises from current pedagogical emphasis on active learning and scientific process. In this paper, we describe some educational experiences on the use of research articles in chemistry higher education. Additionally, we present our own conclusions on the use of such methodology applied to a scientific communication course offered to undergraduate chemistry students at the University of São Paulo, ...
Katz, Daniel S [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA); Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States); Weissman, Jon [Univ. of Minnesota, Minneapolis, MN (United States); Turilli, Matteo [Rutgers Univ., New Brunswick, NJ (United States)
This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented as software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and the resource layer (bundles), derives a suitable execution strategy for the given skeleton, and enacts its execution by means of pilots on one or more resources, depending on the application requirements and on resource availabilities and capabilities.
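A rough sketch of how the four abstractions named above might fit together. The class shapes and the greedy placement policy are illustrative assumptions, not the AIMES implementation:

```python
from dataclasses import dataclass

@dataclass
class Skeleton:            # application layer: abstract description of the workload
    n_tasks: int
    cores_per_task: int

@dataclass
class Bundle:              # resource layer: aggregated information about one resource
    name: str
    free_cores: int

def execution_strategy(skeleton, bundles):
    """Derive a placement plan for the skeleton over the bundles (greedy sketch).

    Each (name, cores) entry stands for a pilot of that size placed on
    the named resource.
    """
    needed = skeleton.n_tasks * skeleton.cores_per_task
    plan = []
    for b in sorted(bundles, key=lambda b: -b.free_cores):
        if needed <= 0:
            break
        take = min(b.free_cores, needed)
        plan.append((b.name, take))
        needed -= take
    return plan
```

The middleware layer would then submit the pilots in the plan and route tasks from the skeleton onto them.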
Madiedo, J. M.
The Virtual Museum for Meteorites (Figure 1) was created as a tool for students, educators and researchers [1, 2]. One of the aims of this online resource is to promote interest in meteorites. Thus, the role of meteorites in education and outreach is fundamental, as these are very valuable tools to promote the public's interest in Astronomy and Planetary Sciences. Meteorite exhibitions reveal the fascination of students, educators and even researchers for these extraterrestrial rocks and how they can explain many key questions related to the origin and evolution of our Solar System. However, despite the efforts of private collectors, museums and other institutions to organize meteorite exhibitions, the reach of these is usually limited. The Virtual Museum for Meteorites takes advantage of HTML and related technologies to overcome local boundaries and offer its contents to a global audience. A description of the recent developments performed in the framework of this virtual museum is given in this work.
Innovation, and thus the production of knowledge, becomes a factor of competitiveness. In this context quality management could be complemented by knowledge management, with the aim of improving knowledge production by the research activities process. To this end, after describing knowledge and information typologies in engineering activities, a knowledge management system is proposed. The goal is to support: (1) semi-structured information (e.g. reports), thanks to the BASIC-Lab tool functions, which are based on attributing points of view and annotations to documents and document zones, and (2) non-structured information (such as mail and dialogues), thanks to the MICA-Graph approach, which intends to support the exchange of technical messages concerning the common resolution of research problems within project teams and to capitalise relevant knowledge. For both approaches, prototype tools have been developed and evaluated, primarily to feed back manufacturing knowledge in the EADS industrial environment.
Hakim, Toufic M.; Garg, Shila
The National Science Foundation's 1996 report "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering and Technology" urged that in order to improve SME&T education, decisive action must be taken so that "all students have access to excellent undergraduate education in science .... and all students learn these subjects by direct experience with the methods and processes of inquiry." Research-related educational activities that integrate education and research have been shown to be valuable in improving the quality of education and enhancing the number of majors in physics departments. Student researchers develop a motivation to continue in science and engineering through an appreciation of how science is done and the excitement of doing frontier research. We will address some of the challenges of integrating research into the physics undergraduate curriculum effectively. The departmental and institutional policies and infrastructure required to help prepare students for this endeavor will be discussed as well as sources of support and the establishment of appropriate evaluation procedures.
Otrel-Cass, Kathrin; Cowie, Bronwen
When practising teachers take time to exchange their experiences and reflect on their teaching realities as critical friends, they add meaning and depth to educational research. When peer talk is facilitated through video chat platforms, teachers can meet (virtually) face to face even when … recordings were transcribed and used to prompt further discussion. The recording of the video chat meetings provided an opportunity for researchers to listen in and follow up on points they felt needed further unpacking or clarification. The recorded peer video chat conversations provided an additional … opportunity to stimulate and support teacher participants in a process of critical analysis and reflection on practice. The discussions themselves were empowering because, in the absence of the researcher, the teachers, in negotiation with peers, chose what was important enough to them to take time to discuss …
Ramalho, A.J.G.; Marques, J.G.; Cardeira, F.M.
A short presentation is made of the utilisation of the Portuguese Research Reactor, its problems and the solutions found. Starting with the initial calibration and experiments, routine operation at full power followed. The problems then encountered, which led to the refurbishment, are described. The present status of the system is then presented, and from that conclusions for the future are drawn. (author)
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it
Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian
In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…
Brownell, Marni D.; Jutte, Douglas P.
Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…
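De-identified record linkage of the kind described above is often built on keyed one-way hashes of identifying fields, so that the same individual yields the same key across services without the identity itself being stored. A minimal sketch; the field choice and salting scheme are illustrative, not the cited multistage process:

```python
import hashlib

def link_key(name, dob, salt):
    """One-way linkage key from identifying fields.

    Light normalisation (trim, lowercase) before hashing so trivially
    different spellings of the same record still link; the salt is kept
    by a trusted third party so holders of the data cannot reverse-match.
    """
    raw = f"{name.strip().lower()}|{dob}|{salt}"
    return hashlib.sha256(raw.encode()).hexdigest()
```

Records from different services carrying the same key can then be joined for analysis while the original identifiers are discarded.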
Baran, Evrim; Chuang, Hsueh-Hua; Thompson, Ann
TPACK (technological pedagogical content knowledge) has emerged as a clear and useful construct for researchers working to understand technology integration in learning and teaching. Whereas first generation TPACK work focused upon explaining and interpreting the construct, TPACK has now entered a second generation where the focus is upon using…
Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.
Although much attention has been paid to business processes over the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research
Willis, Danny G; Grace, Pamela J
This article uses an exemplar of phenomenological research on middle school boys' experiences of being bullied as applied philosophy and science to illuminate the intersection of the moral and scientific realms for theory-oriented research and practice. As a consequence, a clear foundation is provided for advancing nursing science and envisioning innovative nursing practice with boys who experience being bullied. Included is a weaving together of the phenomenological perspective for research and practice, Rogers' (nursing) Science of Unitary Human Beings (SUHB), and SUHB-derived middle range theories of self-transcendence and power.
Yu, T.; Lu, R.; Bishop, L.
Biofilm processes are widely utilized in environmental engineering for the biodegradation of contaminated waters, gases and soils, and it is important to understand the structure and functions of biofilms. Microelectrodes are novel experimental tools for environmental biofilm studies. The authors reviewed oxygen, sulfide, redox potential and pH microelectrode techniques. These microelectrodes have tip diameters of 3 to 20 μm, resulting in a high spatial resolution, and enable us to directly measure the chemical conditions produced by microbial activities in biofilms. The authors also reported laboratory and field studies of wastewater biofilms using microelectrode techniques. The results of these studies provided experimental evidence on the stratification of microbial processes and the associated redox potential change in wastewater biofilms: (1) The oxygen penetration depth was only a fraction of the biofilm thickness. This observation, first made under laboratory conditions, has been confirmed under field conditions. (2) Biofilms with both aerobic oxidation and sulfate reduction had a clearly stratified structure. This was evidenced by a sharp decrease of redox potential near the interface between the aerobic zone and the sulfate reduction zone within the biofilm. In this type of biofilm, aerobic oxidation took place only in a shallow layer near the biofilm surface and sulfate reduction occurred in the deeper anoxic zone. (3) The redox potential changed with the shift of the primary microbial process in biofilms, indicating that it is possible to use redox potential to help illustrate the structure and functions of biofilms. (author)
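The oxygen penetration depth reported above is read off a measured microprofile. A minimal sketch that interpolates the depth at which the O2 concentration first falls below a threshold; the profile values and the threshold are invented for illustration:

```python
def penetration_depth(depths_um, o2_mg_l, threshold=0.1):
    """First depth (µm) at which O2 drops below the threshold.

    Linearly interpolates between the two measurement points that
    bracket the threshold crossing; returns None if O2 never drops.
    """
    points = list(zip(depths_um, o2_mg_l))
    for (d0, c0), (d1, c1) in zip(points, points[1:]):
        if c0 >= threshold > c1:
            # linear interpolation between the bracketing points
            return d0 + (c0 - threshold) * (d1 - d0) / (c0 - c1)
    return None
```

Comparing this depth with the total biofilm thickness gives the "fraction of the biofilm thickness" observation quoted in the abstract.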
Human pluripotent stem cells (hPSCs), namely embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs), with their ability of indefinite self-renewal and capability to differentiate into cell types derived from all three germ layers, represent a powerful research tool in developmental biology, for drug screening, disease modelling, and potentially cell replacement therapy. Efficient differentiation protocols that result in the cell type of interest are needed for maximal exploitation of these cells. In the present work, we focus on protocols for the differentiation of hPSCs into functional cardiomyocytes in vitro, as well as achievements in heart disease modelling and drug testing on patient-specific iPSC-derived cardiomyocytes (iPSC-CMs).
Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others. There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...
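A Jupyter Notebook is itself a plain JSON document, which is part of what lets tools like Binder and repo2docker treat it as a reproducible, versionable artifact. A minimal notebook built with only the standard library; the cell contents are illustrative:

```python
import json

# a minimal notebook document following the nbformat 4 layout:
# top-level metadata plus a list of typed cells
nb = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {},
    "cells": [
        {"cell_type": "markdown", "metadata": {}, "source": ["# Results"]},
        {"cell_type": "code", "execution_count": None, "metadata": {},
         "outputs": [], "source": ["print(6 * 7)"]},
    ],
}

text = json.dumps(nb, indent=1)   # this string is a valid .ipynb file body
```

Because the format is plain JSON, any of the projects mentioned above can parse, execute, or rebuild such a document programmatically.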
Wright, J; Wagner, A
Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...
Blažková, Michaela; Člověčko, M.; Eltsov, V. B.; Gažo, E.; de Graaf, R.; Hosio, J.J.; Krusius, M.; Schmoranzer, D.; Schoepe, W.; Skrbek, Ladislav; Skyba, P.; Solntsev, R.E.; Vinen, W. F.
Vol. 150 (2008), pp. 525-535. ISSN 0022-2291. R&D Projects: GA ČR GA202/05/0218. Grants - others: GAUK (CZ) 7953/2007; Transnational Access Programme (XE) RITA-CT-2003-505313. Institutional research plan: CEZ:AV0Z10100520. Keywords: normal 3He; superfluid 3He; superfluid 4He; turbulence; cavitation; quartz tuning fork. Subject RIV: BK - Fluid Dynamics. Impact factor: 1.034, year: 2008
An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MCHD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. An overall assessment of the state-of-the-art of analytical microplasma research is provided.
Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States)
Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. The survey established, arguably for the first time, the relationship between infrastructure and scientific distributed applications. It examined well-known contributors to the complexity associated with infrastructure, such as inconsistent internal and external interfaces, and demonstrated the correlation with application brittleness. It discussed how infrastructure complexity reinforces the challenges inherent in developing distributed applications.
Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To
Hege, Inga; Kononowicz, Andrzej A; Adler, Martin
Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Keeping our concept grounded in a former qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback, which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map when they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings, which is a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create less nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports the clinical reasoning skills acquisition. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
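The student-versus-expert comparison of MeSH-coded concept-map nodes described above can be sketched as set overlap. The scoring below is an illustrative assumption, not the tool's actual algorithm:

```python
def map_agreement(student_terms, expert_terms):
    """Compare a student's concept-map nodes against the expert map.

    Terms are assumed to be normalised MeSH headings, so equality of
    strings stands in for equality of concepts.
    """
    s, e = set(student_terms), set(expert_terms)
    shared = s & e
    return {
        "matched": sorted(shared),
        "missed": sorted(e - s),        # expert concepts the student lacks
        "extra": sorted(s - e),         # student concepts outside the expert map
        "coverage": len(shared) / len(e),
    }
```

Feedback at each stage of the virtual patient could then highlight the `missed` concepts, consistent with the abstract's observation that students create fewer nodes than experts.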
Curtis, Helen J; Goldacre, Ben
We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing.
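The "similar drug names" matching step (the further 6.5%) can be approximated with fuzzy string matching. A sketch using the standard library's difflib; the formulary list and cutoff are illustrative, not the authors' method:

```python
import difflib

# a toy stand-in for the current national formulary
FORMULARY = ["Amoxicillin", "Atorvastatin", "Paracetamol", "Salbutamol"]

def map_drug(name, formulary=FORMULARY, cutoff=0.8):
    """Exact match first; otherwise best fuzzy candidate above the cutoff, else None."""
    if name in formulary:
        return name
    close = difflib.get_close_matches(name, formulary, n=1, cutoff=cutoff)
    return close[0] if close else None
```

Drugs returning None would correspond to the small residue that could not be assigned a current chemical code.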
The purpose of this review, carried out for the Minister of the Interior of the Federal Republic of Germany, is to inform all parties involved in the licensing procedure, as well as the consulting councils, about the newest nuclear safety research results and the status of their verification in a precise, concise manner. In addition, expert opinions are given with regard to the relevance of these research results to nuclear rules and guidelines as well as to the execution of the Atomic Law. Each report is a short evaluation of a final research report. These evaluations are carried out by specialists who are acquainted with the technical aspects of the licensing procedure for nuclear power plants in the Federal Republic of Germany. (orig.)
Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk
In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study
This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth science research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft and is capable of conducting long-duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), the Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.
Lal, Shalini; Donnelly, Catherine; Shin, Jennifer
Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.
Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi
The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for exploration-class missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community. ExMC has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and to increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, (a) derived from several sources, (b) including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports, which identify any "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.
Full Text Available Alexander M Walker,1 Amanda R Patrick,2 Michael S Lauer,3 Mark C Hornbrook,4 Matthew G Marin,5 Richard Platt,6 Véronique L Roger,7 Paul Stang,8 Sebastian Schneeweiss2. 1World Health Information Science Consultants, Newton, MA; 2Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women's Hospital, Boston, MA; 3National Heart, Lung, and Blood Institute, National Institutes of Health, Bethesda, MD; 4The Center for Health Research, Kaiser Permanente Northwest, Portland, OR; 5Department of Medicine, New Jersey Medical School, Newark, NJ; 6Department of Population Medicine, Harvard Pilgrim Health Care Institute and Harvard Medical School, Boston, MA; 7Department of Health Sciences Research, Mayo Clinic, Rochester, MN; 8Johnson and Johnson Pharmaceutical Research and Development, Titusville, NJ, USA. Background: Comparative effectiveness research (CER) provides actionable information for health care decision-making. Randomized clinical trials cannot provide the patients, time horizons, or practice settings needed for all required CER. The need for comparative assessments and the infeasibility of conducting randomized clinical trials in all relevant areas are leading researchers and policy makers to non-randomized, retrospective CER. Such studies are possible when rich data exist on large populations receiving alternative therapies that are used as if interchangeably in clinical practice. This setting we call “empirical equipoise.” Objectives: This study sought to provide a method for the systematic identification of settings in which empirical equipoise offers promise for non-randomized CER. Methods: We used a standardizing transformation of the propensity score called “preference” to assess pairs of common treatments for uncomplicated community-acquired pneumonia and new-onset heart failure in a population of low-income elderly people in Pennsylvania, for whom we had access to de-identified insurance records. Treatment
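The "preference" transformation of the propensity score mentioned above can be illustrated as below; the formula used (logit of preference equals logit of propensity minus logit of overall treatment prevalence) is the standardisation this line of work describes, but the function itself is our sketch, not the authors' code.

```python
# Sketch: standardise a propensity score by the treatment prevalence.
# logit(preference) = logit(propensity) - logit(prevalence)
import math

def preference(ps, prevalence):
    """Preference score in (0, 1); 0.5 marks empirical equipoise."""
    logit = lambda p: math.log(p / (1 - p))
    z = logit(ps) - logit(prevalence)
    return 1 / (1 + math.exp(-z))  # inverse logit

# When a patient's propensity equals the overall prevalence, the
# preference is exactly 0.5: the "empirical equipoise" point.
print(preference(0.30, 0.30))
```

Patients whose preference scores cluster near 0.5 are those for whom the alternative therapies appear to be used as if interchangeably, which is what makes non-randomized comparison plausible.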
Offersen, Sara Marie Hebsgaard
that the Danes are encouraged to be alert to still earlier and vaguer bodily signs of potential cancer and seek care ‘in time’. With biomedical constructions such as ‘cancer awareness’ and ‘alarm symptoms of cancer’ and the retrospectively oriented definition of life before symptoms-based healthcare seeking … and articulation of bodily sensations, and how decisions about healthcare seeking are established in this context. This dissertation aims to explore these matters from the perspective of the Danish middle class, mainly focusing on how sensations are ascribed meaning as symptoms and how they are evoked … on a continuum between what is locally considered ordinary and extraordinary. Overall, the dissertation argues that inquiries into morality and potentiality provide valuable insights into healthcare seeking practices and the making and management of symptoms in everyday life. The dissertation is based on 18 …
Everett-Murphy, K.; De Villiers, A.; Ketterer, E.; Steyn, K.
As part of a comprehensive programme to prevent non-communicable disease in South Africa, there is a need to develop public education campaigns on healthy eating. Urban populations of lower socioeconomic status are a priority target population. This study involved formative research to guide the development of a nutrition resource appropriate to…
Scanlon, J.J.; Rolader, G.E.; Jamison, K.A.; Petresky, H.
Electromagnetic Launcher (EML) research at the Air Force Armament Laboratory, Hypervelocity Launcher Branch (AFATL/SAH), Eglin AFB, has focused on developing the technologies required for repetitively launching several-kilogram payloads to high velocities. Previous AFATL/SAH experiments have been limited by the available power supply, resulting in small muzzle energies on the order of hundreds of kilojoules. In an effort to advance the development of EMLs, AFATL/SAH has designed and constructed a battery power supply (BPS) capable of providing several megaamperes of current for several seconds. This system consists of six modules, each containing 2288 automotive batteries, which may be connected in two different series-parallel arrangements. In this paper the authors define the electrical characteristics of the AFATL battery power supply at the component level.
Rogers, Jan; SanSoucie, Mike
Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.
D R Simmons
A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image which is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft-Palate Craniofacial Journal, in press). We have argued that “scarriness” is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information that allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Potential descriptors for other image classes would be “metallic”, “natural”, or “liquid”. Having developed an automatic algorithm to locate scars in medical images, we then tested “ground truth” by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor “scarriness”. A simultaneous analysis of qualitative descriptions of the scarring by the observers revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other areas, such as botany, geology and archaeology.
Bronsema, B. [Afdeling Architectural Engineering en Technology, Faculteit Bouwkunde, Technische Universiteit Delft TUD, Delft (Netherlands)
The Earth, Wind and Fire concept transforms a building into a 'climate machine' which is powered by the natural forces and energy of the sun, the wind, the mass of the earth, and gravity. This concept consists of a Climate Cascade, a solar chimney and a Ventec roof, which have been tested in physical mock-ups. Simulation models have been validated on the basis of real measurements. This work has resulted in reliable tools for design practice. [Translated from Dutch] The Earth, Wind and Fire concept for natural air conditioning offers more certainty for realising energy-neutral office buildings than would be possible by improving existing techniques. The concept uses the ambient energy of the earth's mass, the wind and the sun. On the one hand, this energy is used passively to realise natural air conditioning, in which the desired air flows arise under the influence of thermally driven pressure differences. On the other hand, the sun and wind are harnessed for active energy generation, so that a building can in principle become energy-neutral. Such a building can be regarded as a 'climate machine', activated by gravity, wind and sun.
Fumagalli, E.; Verdelli, G.
ISMES (Experimental Institute for Models and Structures) is carrying out a series of tests on physical models as part of a research programme sponsored by DSR (Studies and Research Direction) of ENEL (Italian State Electricity Board) on behalf of CPN (Nuclear Design and Construction Centre) of ENEL, with the aim of testing a thin-walled PCPV for a BWR. The physical model, together with the mathematical model and the rheological model of the materials, is intended as a meaningful design tool. The mathematical model covers the overall structural design phase (geometries) and the linear behaviour, whereas the physical model, besides global information to be compared with the results of the mathematical model, supplies data on the non-linear behaviour up to failure and on local conditions (penetration area, etc.). The aim of the first phase of this research programme is to compare calculation and experiment with regard to the thicknesses of the wall and the bottom slab, whereas the second phase deals with the behaviour of the removable lid and its connection with the main structure. To this end, a model at scale 1:10 has been designed which symmetrically reproduces, with respect to the equator, the bottom part of the structure. In the bottom slab the penetrations of the prototype design are reproduced, whereas the upper slab is plain. This paper describes the model and illustrates the main results, underlining the different behaviour of the upper and bottom slabs up to collapse.
Verma, Ark; Brysbaert, Marc
Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Stevenson, Jennifer; Gaarder, Marie
A range of organizations are engaged in the production of evidence on the effects of health, social, and economic development programs on human welfare outcomes. However, evidence is often scattered around different databases, web sites, and the gray literature and is often presented in inaccessible formats. Lack of overview of the evidence in a specific field can be a barrier to the use of existing research and prevent efficient use of limited resources for new research. Evidence & Gap Maps (EGMs) aim to address these issues and complement existing synthesis and mapping approaches. EGMs are a new addition to the tools available to support evidence-informed policymaking. To provide an accessible resource for researchers, commissioners, and decision makers, EGMs provide thematic collections of evidence structured around a framework which schematically represents the types of interventions and outcomes of relevance to a particular sector. By mapping the existing evidence using this framework, EGMs provide a visual overview of what we know and do not know about the effects of different programs. They make existing evidence available, and by providing links to user-friendly summaries of relevant studies, EGMs can facilitate the use of existing evidence for decision making. They identify key "gaps" where little or no evidence from impact evaluations and systematic reviews is available and can be a valuable resource to inform a strategic approach to building the evidence base in a particular sector. The article will introduce readers to the concept and methods of EGMs and present a demonstration of the EGM tool using existing examples. Copyright © 2016 Elsevier Inc. All rights reserved.
Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to the Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...
Zhongqi Sheng; Lei Zhang; Hualong Xie; Changchun Liu
Assembly produces the maximum workload and consumes the most time during the product design and manufacturing process. The CNC machine tool is key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information on transmission, and geometric constraint information were quantified to as...
Grigoriev, S. N.; Bobrovskij, N. M.; Melnikov, P. A.; Bobrovskij, I. N.
The modern vector of development of machining technologies is aimed at the transition to environmentally safe, "green" technologies. The concept of "green technology" denotes a body of knowledge intended for practical use. One of the ways to improve the quality of production is the use of surface plastic deformation (SPD) processing methods. The advantage of SPD is its capability to combine the effects of finishing and strengthening treatment; SPD processing can replace operations such as fine turning, grinding or polishing. SPD is a forceful contact impact of an indenter on the workpiece's surface under conditions of relative motion between them. It is difficult to implement the core SPD technologies (burnishing, roller burnishing, etc.) while maintaining their core technological advantages without the use of lubricating and cooling technology (metalworking fluids, MWF). The "green" SPD technology developed by the authors for dry processing does not have such shortcomings. When processing with SPD without the use of MWF, the requirements for tool durability are most demanding, especially under mass-production conditions. It is important to determine the tool's durability period at the design stage of the technological process in order to prevent wastage. This paper presents the results of durability research on natural and synthetic diamonds (polycrystalline diamond, ASPK) as well as the precision of polycrystalline superabrasive tools made of dense boron nitride (DBN) during SPD processing without the application of MWF.
Baggi, Fulvio; Mantegazza, Renato; Antozzi, Carlo; Sanders, Donald
Clinical registries may facilitate research on myasthenia gravis (MG) in several ways: as a source of demographic, clinical, biological, and immunological data on large numbers of patients with this rare disease; as a source of referrals for clinical trials; and by allowing rapid identification of MG patients with specific features. Physician-derived registries have the added advantage of incorporating diagnostic and treatment data that may allow comparison of outcomes from different therapeutic approaches, which can be supplemented with patient self-reported data. We report the demographic analysis of MG patients in two large physician-derived registries, the Duke MG Patient Registry, at the Duke University Medical Center, and the INNCB MG Registry, at the Istituto Neurologico Carlo Besta, as a preliminary study to assess the consistency of the two data sets. These registries share a common structure, with an inner core of common data elements (CDE) that facilitate data analysis. The CDEs are concordant with the MG-specific CDEs developed under the National Institute of Neurological Disorders and Stroke Common Data Elements Project. © 2012 New York Academy of Sciences.
Hunter, Jill V; Wilde, Elisabeth A; Tong, Karen A; Holshouser, Barbara A
This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. It attempts to address some of the potential uses of more advanced forms of imaging in TBI and highlights some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and their potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective methods for large-scale image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin labeling (ASL), functional MRI (fMRI), including resting-state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. We also include brief introductions to other specialized forms of advanced imaging that currently require specialized equipment, for example, single-photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children.
Abstract Background Policy makers, clinicians and researchers are demonstrating increasing interest in using data linked from multiple sources to support measurement of clinical performance and patient health outcomes. However, the utility of data linkage may be compromised by sub-optimal or incomplete linkage, leading to systematic bias. In this study, we synthesize the evidence identifying participant or population characteristics that can influence the validity and completeness of data linkage and may be associated with systematic bias in reported outcomes. Methods A narrative review, using structured search methods, was undertaken. Key words "data linkage" and the MeSH term "medical record linkage" were applied to the Medline, EMBASE and CINAHL databases between 1991 and 2007. Abstract inclusion criteria were: the article attempted an empirical evaluation of methodological issues relating to data linkage and reported on patient characteristics, the study design included analysis of matched versus unmatched records, and the report was in English. Included articles were grouped thematically according to patient characteristics that were compared between matched and unmatched records. Results The search identified 1810 articles, of which 33 (1.8%) met inclusion criteria. There was marked heterogeneity in study methods and factors investigated. Characteristics that were unevenly distributed among matched and unmatched records were: age (72% of studies), sex (50% of studies), race (64% of studies), geographical/hospital site (93% of studies), socio-economic status (82% of studies) and health status (72% of studies). Conclusion A number of relevant patient or population factors may be associated with incomplete data linkage, resulting in systematic bias in reported clinical outcomes. Readers should consider these factors in interpreting the reported results of data linkage studies.
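The core bias check this review describes, comparing linkage success across patient subgroups, can be sketched as follows; the group labels and rates are synthetic, purely for illustration.

```python
# Sketch: compute the record-linkage match rate per subgroup. Uneven rates
# across groups (e.g. by site, age band, or socio-economic status) signal
# the kind of systematic bias the review warns about.

def match_rate_by_group(records):
    """records: iterable of (group, matched_bool); returns rate per group."""
    totals, matched = {}, {}
    for group, ok in records:
        totals[group] = totals.get(group, 0) + 1
        matched[group] = matched.get(group, 0) + int(ok)
    return {g: matched[g] / totals[g] for g in totals}

records = ([("urban", True)] * 90 + [("urban", False)] * 10
           + [("rural", True)] * 70 + [("rural", False)] * 30)
print(match_rate_by_group(records))  # uneven rates suggest systematic bias
```

In a real study one would test whether the difference between groups is larger than chance before concluding that unmatched records are non-random.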
McQuade, Sarah; Davis, Louise; Nash, Christine
Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…
Sacramento, Jose Miguel Noronha
This work aims to assist research institutes, notably IPEN, in improving their assertiveness in the process of defining their research lines. The pace of evolution has increased exponentially, requiring greater synchronism and multiple, coordinated action from the three elements fundamental to the development of contemporary society: Government, the Productive Structure, and the Infrastructure in Science and Technology. This increasingly dynamic and changing environment imposes greater proximity to the socioeconomic environment, now that the former client-consumer has become a co-creator of knowledge and a supplier of energy within a new standard of social relations, the Networked Society. The difference in tempo among the University, the Productive Structure and Government is a function of their main activities: Science, the Market, and the winning of Public Opinion, respectively. The equation that will harmonize and find synergies among these three dimensions is the contemporary challenge for those who seek to innovate and advance knowledge in order to improve the standard of living of society. This work shows that research institutes should heed the words of Robert Plomin and start connecting to the several links in different chains in order to make use of a collective intelligence that expands continuously, in speed and quality, beyond any other time in human history. The comparison among the results obtained from the different methodologies of analysis proposed in this work reveals strengths and weaknesses, threats and opportunities for IPEN, providing input for finding better ways to tailor its performance to the new demands. (author)
Martins, Maria da Penha Sanches
This work presents a study of human factors and possible reasons for human failure that can cause incidents, accidents and worker exposure associated with the risks intrinsic to the profession. The objective is to contribute to the work of the operators of the IEA-R1 reactor located at IPEN-CNEN/SP. Accidents in the technological field, including the nuclear field, have shown that the causes are much more often connected to human failure than to system and equipment failures, which has led regulatory bodies to consider studies on human failure. The research proposed in this work is quantitative/qualitative and also descriptive. Two questionnaires were used to collect data. The first was elaborated from the safety culture attributes described by the International Atomic Energy Agency (IAEA). The second considered individual and situational factors, composing categories that could affect people in the work area. A carefully selected transcription of the theoretical basis for the study of human factors was used. The methodology demonstrated a good degree of reliability. The results point to mediating factors that require direct action concerning the needs of the group and of the individual. This research shows that a truly effective planning and organization unit is necessary, not only for physical and psychological health issues but also for safety at work. (author)
Fancy, Steven G.; Pank, Larry F.; Douglas, David C.; Curby, Catherine H.; Garner, Gerald W.; Amstrup, Steven C.; Regelin, Wayne L.
operation, the UHF (ultra-high frequency) signal failed on three of 32 caribou transmitters and 10 of 36 polar bear transmitters.A geographic information system (GIS) incorporating other databases (e.g., land cover, elevation, slope, aspect, hydrology, ice distribution) was used to analyze and display detailed locational and behavioral data collected via satellite. Examples of GIS applications to research projects using satellite telemetry and examples of detailed movement patterns of caribou and polar bears are presented. This report includes documentation for computer software packages for processing Argos data and presents developments, as of March 1987, in transmitter design, data retrieval using a local user terminal, computer software, and sensor development and calibration.
Crossley, Scott A.
This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…
Søndergaard, Lene Vammen; Dagnæs-Hansen, Frederik; Herskin, Mette S
of the extent of welfare assessment in pigs used in biomedical research and to suggest a welfare assessment standard for research facilities, based on an exposition of ethological considerations relevant to the welfare of pigs in biomedical research. The tools for porcine welfare assessment presented suggest …
This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…
The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to ...
Highlights: → A highly flexible neutronic core simulator was developed. → The tool estimates the static neutron flux, the eigenmodes, and the neutron noise. → The tool was successfully validated via many benchmark cases. → The tool can be used for research and education. → The tool is freely available. - Abstract: This paper deals with the development, validation, and demonstration of an innovative neutronic tool. The novelty of the tool resides in its versatility, since many different systems can be investigated and different kinds of calculations can be performed. More precisely, both critical systems and subcritical systems with an external neutron source can be studied, and static and dynamic cases in the frequency domain (i.e. for stationary fluctuations) can be considered. In addition, the tool has the ability to determine the different eigenfunctions of any nuclear core. For each situation, the static neutron flux, the different eigenmodes and eigenvalues, the first-order neutron noise, and their adjoint functions are estimated, as well as the effective multiplication factor of the system. The main advantages of the tool, which is entirely MATLAB based, lie in the robustness of the implemented numerical algorithms, its high portability between different computer platforms and operating systems, and its ease of use, since no input deck writing is required. The present version of the tool, which is based on two-group diffusion theory, is mostly suited to investigating thermal systems. The definition of both the static and dynamic core configurations directly from the static macroscopic cross-sections and their fluctuations, respectively, makes the tool particularly well suited for research and education. Some of the many benchmark cases used to validate the tool are briefly reported. The static and dynamic capabilities of the tool are also demonstrated for the following configurations: a vibrating control rod, a perturbation traveling upwards
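As a toy illustration of the static eigenvalue problem such a simulator solves, here is a one-group, 1-D slab diffusion calculation with power iteration for the effective multiplication factor; the actual tool is two-group and far more general, and all material data below are invented.

```python
# Toy sketch: k-eff of a bare homogeneous slab via one-group diffusion,
# finite differences (zero-flux boundaries), and power iteration.

def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal system."""
    n = len(rhs)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def keff_slab(n=50, width=100.0, D=1.0, sig_a=0.01, nu_sigf=0.012, iters=200):
    """Power iteration: solve (-D*lapl + sig_a) phi = nu_sigf * phi / k."""
    h = width / (n + 1)
    sub = [-D / h**2] * n
    sup = [-D / h**2] * n
    diag = [2 * D / h**2 + sig_a] * n
    phi = [1.0] * n
    k = 1.0
    for _ in range(iters):
        src = [nu_sigf * p / k for p in phi]  # fission source scaled by 1/k
        new = solve_tridiag(sub, diag, sup, src)
        k *= sum(new) / sum(phi)              # update eigenvalue estimate
        peak = max(new)
        phi = [p / peak for p in new]         # renormalise the flux
    return k

print(round(keff_slab(), 2))  # close to the analytic one-group value ~1.09
```

The iteration converges to the fundamental mode, whose analytic one-group value is nu_sigf / (sig_a + D * (pi / width)^2); a two-group solver like the one in the paper couples two such equations through scattering and fission terms.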
Poduska, Jeanne; Kellam, Sheppard; Brown, C Hendricks; Ford, Carla; Windham, Amy; Keegan, Natalie; Wang, Wei
While a number of preventive interventions delivered within schools have shown both short-term and long-term impact in epidemiologically based randomized field trials, programs are not often sustained with high-quality implementation over time. This study was designed to support two purposes. The first purpose was to test the effectiveness of a universal classroom-based intervention, the Whole Day First Grade Program (WD), aimed at two early antecedents to drug abuse and other problem behaviors, namely, aggressive, disruptive behavior and poor academic achievement. The second purpose--the focus of this paper--was to examine the utility of a multilevel structure to support high levels of implementation during the effectiveness trial, to sustain WD practices across additional years, and to train additional teachers in WD practices. The WD intervention integrated three components, each previously tested separately: classroom behavior management; instruction, specifically reading; and family-classroom partnerships around behavior and learning. Teachers and students in 12 schools were randomly assigned to receive either the WD intervention or the standard first-grade program of the school system (SC). Three consecutive cohorts of first graders were randomized within schools to WD or SC classrooms and followed through the end of third grade to test the effectiveness of the WD intervention. Teacher practices were assessed over three years to examine the utility of the multilevel structure to support sustainability and scaling-up. The design employed in this trial appears to have considerable utility to provide data on WD effectiveness and to inform the field with regard to structures required to move evidence-based programs into practice. NCT00257088.
Showcasing exemplars of how various aspects of design research were successfully transitioned into, and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the impact of design research on industrial practice. Chapters written by internationally acclaimed design researchers analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform the industrial practice of engineering design, whilst the chapters written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice, as well as guidelines and platforms for successf...
Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R
This report describes miRQuest, a novel middleware available on a Web server that allows the end user to carry out miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. Three use cases of the tools are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.
Morguí, Josep-Anton; Font, Anna; Cañas, Lidia; Vázquez-García, Eusebi; Gini, Andrea; Corominas, Ariadna; Àgueda, Alba; Lobo, Agustin; Ferraz, Carlos; Nofuentes, Manel; Ulldemolins, Delmir; Roca, Alex; Kamnang, Armand; Grossi, Claudia; Curcoll, Roger; Batet, Oscar; Borràs, Silvia; Occhipinti, Paola; Rodó, Xavier
An educational tool was designed with the aim of making the research done on Greenhouse Gases (GHGs) in the ClimaDat Spanish network of atmospheric observation stations (www.climadat.es) more comprehensible. This tool, called the Air Enquirer, consists of a multi-sensor box. More than two hundred boxes are envisaged to be built and supplied to Spanish high schools through the education portal (www.educaixa.com) of the "Obra Social 'La Caixa'", which funds this research. The starting point for the development of the Air Enquirer was the experience at IC3 (www.ic3.cat) in the CarboSchools+ FP7 project (www.carboschools.cat, www.carboschools.eu). The Air Enquirer's multi-sensor box is based on the Arduino architecture and contains sensors for CO2, temperature, relative humidity, pressure, and both infrared and visible luminance. The Air Enquirer is designed for taking continuous measurements. Every ensemble of Air Enquirer measurements is converted to standard units (water content in ppmv, and CO2 in ppmv_dry). These values are referred to a calibration made with Cavity Ring-Down Spectrometry (Picarro®) under different temperature, pressure, humidity, and CO2 concentrations. Multiple sets of Air Enquirers are intercalibrated for their use in parallel during the experiments. The different experiments proposed to the students will be outdoor (observational) or indoor (experimental, in the lab), focusing on understanding the biogeochemistry of GHGs in the ecosystems (mainly CO2), the exchange (flux) of gases, organic matter production, respiration and decomposition processes, the influence of anthropogenic activities on gas (and particle) exchanges, and their interaction with the structure and composition of the atmosphere (temperature, water content, cooling and warming processes, radiative forcing, vertical gradients and horizontal patterns). In order to ensure the Air Enquirers a high-profile research performance, the experimental designs
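The unit conversions mentioned above (water content in ppmv and CO2 in ppmv_dry) can be sketched as follows. The Magnus coefficients are standard values for saturation vapour pressure over water; the sample numbers are illustrative, not Air Enquirer calibration data.

```python
import math

def h2o_ppmv(rh_percent, temp_c, pressure_hpa):
    """Water vapour mole fraction [ppmv] from relative humidity, T and P.
    Uses the Magnus formula for saturation vapour pressure over water."""
    e_sat = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # hPa
    e = rh_percent / 100.0 * e_sat                                # hPa
    return e / pressure_hpa * 1e6

def co2_dry(co2_wet_ppmv, water_ppmv):
    """Convert a wet CO2 mole fraction to a dry-air mole fraction."""
    return co2_wet_ppmv / (1.0 - water_ppmv / 1e6)

# A humid lab at 25 °C, 60% RH, sea-level pressure: roughly 18,700 ppmv H2O,
# so a raw 420 ppmv CO2 reading rises by several ppmv once dried.
w = h2o_ppmv(60.0, 25.0, 1013.25)
print(round(w), round(co2_dry(420.0, w), 1))
```

Dividing by the dry-air fraction is the standard dilution correction; a reference-grade analyser such as the Picarro spectrometer mentioned above reports ppmv_dry directly, which is what makes it usable as a calibration target.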
Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.
Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.
Kriston, Levente; Melchior, Hanne; Hergert, Anika; Bergelt, Corinna; Watzke, Birgit; Schulz, Holger; von Wolff, Alessa
The aim of our study was to develop a graphical tool that can be used in addition to standard statistical criteria to support decisions on the number of classes in explorative categorical latent variable modeling for rehabilitation research. Data from two rehabilitation research projects were used. In the first study, a latent profile analysis was…
Wight, Evelyn; Gardner, Gene; Harvey, Tony
As a reflection of its growing culture of openness, and in response to the public's need for accurate information about its activities, the U.S. Department of Energy (DOE) Office of the Assistant Secretary for Environmental Restoration and Waste Management (EM) has increased the amount of information available to the public through communication tools such as brochures, fact sheets, and a travelling exhibit with an interactive computer display. Our involvement with this effort has been to design, develop, and critique booklets, brochures, fact sheets, and other communication tools for EM. This paper presents an evaluation of the effectiveness of two communication tools we developed: the EM Booklet and the EM Fact Sheets. We measured effectiveness using non-parametric testing. This paper describes DOE's culture change, EM's communication tools and their context within DOE's new open culture, our research, test methods and results, the significance of our research, and our plans for future research. (author)
Improving Aboriginal maternal and infant health services in the 'Top End' of Australia; synthesis of the findings of a health services research program aimed at engaging stakeholders, developing research capacity and embedding change.
Barclay, Lesley; Kruske, Sue; Bar-Zeev, Sarah; Steenkamp, Malinda; Josif, Cathryn; Narjic, Concepta Wulili; Wardaguga, Molly; Belton, Suzanne; Gao, Yu; Dunbar, Terry; Kildea, Sue
Health services research is a well-articulated research methodology and can be a powerful vehicle for implementing sustainable health service reform. This paper presents a summary of a five-year collaborative program between stakeholders and researchers that led to sustainable improvements in the maternity services for remote-dwelling Aboriginal women and their infants in the Top End (TE) of Australia. A mixed-methods health services research program of work was designed using a participatory approach. The study area consisted of two large remote Aboriginal communities in the Top End of Australia and the hospital in the regional centre (RC) that provided birth and tertiary care for these communities. The stakeholders included consumers, midwives, doctors, nurses, Aboriginal Health Workers (AHWs), managers, policy makers and support staff. Data were sourced from: hospital and health centre records; perinatal and costing data sets; observations of maternal and infant health service delivery and parenting styles; formal and informal interviews with providers and women; and focus groups. Studies examined: indicator sets that identify best care; the impact of quality of care and remoteness on health outcomes; discrepancies in birth counts across a range of different data sets; and ethnographic studies of 'out of hospital' or health centre birth and parenting. A new model of maternity care was introduced by the health service, aiming to improve care following the findings of our research. Some of the improvements introduced during the five-year research program were evaluated. Cost-effective improvements were made to the acceptability, quality and outcomes of maternity care. However, our synthesis identified system-wide problems that still account for poor quality of infant services, specifically, unacceptable standards of infant care and parent support, and no apparent relationship between volume and acuity of presentations and staff numbers with the
Adeline Phaik Harn Chua
Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to address this knowledge gap, this paper presents a future research agenda (in the form of research questions) which can guide the eBusiness research community in conducting much-needed studies in this area. This paper is particularly novel in that it aims to demonstrate how the heterogeneity of SMEs and their specific business uses of eBusiness technology such as blogs can form the central plank of a future research agenda. This is important because the existing eBusiness literature tends to treat eBusiness collectively rather than focusing on the specific business uses of different eBusiness technologies, and to treat SMEs as a homogeneous group. The paper concludes with a discussion of how this research agenda can form the basis of studies which use a range of different research methods, and how this "big picture" agenda approach might help the eBusiness research community build theory which better explains SME adoption and use of eBusiness.
Maar, Marion; Yeates, Karen; Barron, Marcia; Hua, Diane; Liu, Peter; Moy Lum-Kwong, Margaret; Perkins, Nancy; Sleeth, Jessica; Tobe, Joshua; Wabano, Mary Jo; Williamson, Pamela; Tobe, Sheldon W
Non-communicable chronic diseases are the leading causes of mortality globally, and nearly 80% of these deaths occur in low- and middle-income countries (LMICs). In high-income countries (HICs), inequitable distribution of resources affects poorer and otherwise disadvantaged groups including Aboriginal peoples. Cardiovascular mortality in high-income countries has recently begun to fall; however, these improvements are not realized among citizens in LMICs or those subgroups in high-income countries who are disadvantaged in the social determinants of health including Aboriginal people. It is critical to develop multi-faceted, affordable and realistic health interventions in collaboration with groups who experience health inequalities. Based on community-based participatory research (CBPR), we aimed to develop implementation tools to guide complex interventions to ensure that health gains can be realized in low-resource environments. We developed the I-RREACH (Intervention and Research Readiness Engagement and Assessment of Community Health Care) tool to guide implementation of interventions in low-resource environments. We employed CBPR and a consensus methodology to (1) develop the theoretical basis of the tool and (2) to identify key implementation factor domains; then, we (3) collected participant evaluation data to validate the tool during implementation. The I-RREACH tool was successfully developed using a community-based consensus method and is rooted in participatory principles, equalizing the importance of the knowledge and perspectives of researchers and community stakeholders while encouraging respectful dialogue. The I-RREACH tool consists of three phases: fact finding, stakeholder dialogue and community member/patient dialogue. The evaluation for our first implementation of I-RREACH by participants was overwhelmingly positive, with 95% or more of participants indicating comfort with and support for the process and the dialogue it creates. The I
Mishra, P; Patankar, A; Etmektzoglou, A; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States); Lewis, J [Brigham and Women’s Hospital, Boston, MA (United States)
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from a pre-existing DICOM-RT plan. The DICOM-RT plan is first converted to XML format, and then the researcher can interactively modify or add control points. The delivered beam can be verified by reading the generated images and analyzing trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes. The analyzer gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are: i) no software installation or maintenance needed; ii) easy accessibility across all devices; iii) seamless upgrades; and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. Disclosure: The author is a full-time employee of Varian Medical Systems, Palo Alto.
Murawska, Jaclyn M.; Walker, David A.
In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…
Asselin, Marlene; Moayeri, Maryam
Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…
The massive growth of and access to information technology (IT) has enabled the integration of technology into classrooms. One such integration is the use of WebQuests as an instructional tool for teaching targeted learning activities, such as writing abstracts of research articles in English for English as a Foreign Language (EFL) learners. In the academic world, writing an abstract of a research paper or final project in English can be challenging for EFL students. This article presents an action research project on the process and outcomes of using a WebQuest designed to help 20 Indonesian university IT students write a research article's abstract in English. Findings reveal that despite positive feedback, changes need to be made to make the WebQuest a more effective instructional tool for the purpose for which it was designed.
Philip W. Gassman; Manuel R. Reyes; Colleen H. Green; Jeffrey G. Arnold
The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been ad...
The production lines used for manufacturing U-shaped profiles are very complex and must have high productivity. One of the most important stages of the fabrication process is the cutting-off. This paper presents experimental research on, and analysis of, the durability of the cutting tools used for cutting off U-shaped steel profiles. The results of this work can be used to predict the durability of the cutting tools.
Weidema, Froukje C; Molewijk, Bert A C; Kamsteeg, Frans; Widdershoven, Guy A M
Deliberative ways of dealing with ethical issues in health care are expanding. Moral case deliberation is an example, providing group-wise, structured reflection on dilemmas from practice. Although moral case deliberation is well described in the literature, the aims and results of moral case deliberation sessions are unknown. This research shows (a) why managers introduce moral case deliberation and (b) what moral case deliberation participants experience as its results. A responsive evaluation was conducted, explicating moral case deliberation experiences by analysing aims (N = 78) and harvest (N = 255). Naturalistic data collection included interviews with managers and evaluation questionnaires from moral case deliberation participants (nurses). The analysis shows that moral case deliberation fosters cooperation, team bonding, a critical attitude towards routines, and nurses' empowerment. A difference is that managers aim to foster the identity of the nursing profession, whereas nurses emphasize learning processes and understanding perspectives. We conclude that moral case deliberation influences team cooperation in ways that cannot be controlled with traditional management tools, but require time and dialogue. Exchanging aims and harvest between manager and team could result in co-creating (moral) practice, in which improvements for daily cooperation result from bringing together the perspectives of managers and team members.
Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday
In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…
Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.
The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economical and flexible system for this purpose. The CAMAC system also facilitates the development of modular components controlling the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments
Tahmasebi, Farhad; Pearce, Robert
A tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is described. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
Powder brazing filler metals (PBFMs) feature a number of comparative advantages, including low energy consumption, accurate dosage, good brazeability, short production time, and high production efficiency. These filler metals have been used in the aerospace, automobile, and electric appliance industries. PBFMs are especially suitable for bonding diamond tools, which involve complex workpiece shapes and require accurate dosage. Recent research on PBFMs for diamond tools is reviewed in this paper, and current applications are discussed. CuSnTi and Ni-Cr-based PBFMs have been the two most commonly used monolayer PBFMs; the bonding mechanisms at the interface between these monolayer PBFMs and a diamond tool are therefore summarized first, and ways to improve the performance of monolayer PBFMs for diamond tools are analyzed. Next, research on PBFMs for impregnated diamond tools is reviewed, and the technical problems that urgently need solutions are discussed. Finally, the challenges and opportunities in PBFM research and development for diamond tools are summarized, and corresponding prospects are suggested.
Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles
This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.
Engholm, Gerda; Ferlay, Jacques; Christensen, Niels; Bray, Freddie; Gjerstorff, Marianne L; Klint, Asa; Køtlum, Jóanis E; Olafsdóttir, Elínborg; Pukkala, Eero; Storm, Hans H
The NORDCAN database and program (www.ancr.nu) include detailed information and results on cancer incidence, mortality and prevalence in each of the Nordic countries over five decades, and have lately been supplemented with predictions of cancer incidence and mortality; future extensions include the incorporation of cancer survival estimates. The data originate from the national cancer registries and causes-of-death registries in Denmark, Finland, Iceland, Norway, Sweden, and the Faroe Islands and are regularly updated. Presently, 41 cancer entities are included in the common dataset, and conversions of the original national data according to international rules ensure comparability. With 25 million inhabitants in the Nordic countries, 130,000 incident cancers are reported yearly, alongside nearly 60,000 cancer deaths, with almost a million persons living with a cancer diagnosis. This web-based application is available in English and in each of the five Nordic national languages. It includes comprehensive and easy-to-use descriptive epidemiology tools that provide tabulations and graphs, with further user-specified options available. The NORDCAN database aims to provide comparable and timely data to serve the varying needs of policy makers, cancer societies, the public, and journalists, as well as the clinical and research community.
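The cross-country comparability described above rests on standard descriptive-epidemiology computations such as age standardisation. As an illustration only (the counts, person-years, and weights below are invented, not NORDCAN data), an age-standardised rate (ASR) weights the crude rate of each age band by a fixed standard population:

```python
# Hypothetical five-age-band example of an age-standardised rate (ASR).
cases      = [2,     5,     40,    180,   400]    # incident cases per age band
person_yrs = [1e5,   9e4,   8e4,   6e4,   4e4]    # person-years at risk
std_pop    = [12000, 10000, 9000,  7000,  5000]   # standard population weights

# Crude rate in each age band, then the weighted average across bands.
crude = [c / p for c, p in zip(cases, person_yrs)]
asr = sum(r * w for r, w in zip(crude, std_pop)) / sum(std_pop) * 1e5
print(f"ASR = {asr:.1f} per 100,000")   # → ASR = 177.4 per 100,000
```

Because the weights are held fixed, two populations with different age structures can be compared on the resulting rate, which is the point of publishing standardised rather than crude rates.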
Fackrell, Kathryn; Fearnley, Constance; Hoare, Derek J; Sereda, Magdalena
Hypersensitivity to external sounds is often comorbid with tinnitus and may be significant for adherence to certain types of tinnitus management. Therefore, a clear measure of sensitivity to sound is important. The aim of this study was to evaluate the validity and reliability of the Hyperacusis Questionnaire (HQ) for use as a measurement tool using data from a sample of 264 adults who took part in tinnitus research. We evaluated the HQ factor structure, internal consistency, convergent and discriminant validity, and floor and ceiling effects. Internal consistency was high (Cronbach's alpha = 0.88) and moderate correlations were observed between the HQ, uncomfortable loudness levels, and other health questionnaires. Confirmatory factor analysis revealed that the original HQ three-factor solution and a one-factor solution were both a poor fit to the data. Four problematic items were removed and exploratory factor analysis identified a two-factor (attentional and social) solution. The original three-factor structure of the HQ was not confirmed. The full fourteen-item questionnaire does not accurately assess hypersensitivity to sound in a tinnitus population. We propose a 10-item (2-factor) version of the HQ, which will need to be confirmed using a new tinnitus and perhaps nontinnitus population.
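The internal-consistency figure reported above (Cronbach's alpha = 0.88) can be computed directly from raw item responses. A minimal sketch in Python follows; the function name and the data in the test are illustrative, not taken from the study:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of scores."""
    k = items.shape[1]                              # number of items
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

With perfectly correlated items the statistic reaches its maximum of 1.0, which makes a convenient sanity check for the implementation.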
Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf
In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and to integrate them into a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from the command line and its output goes to the standard output, then in a few minutes one can m...
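The integration requirement stated above (a tool that runs from the command line and writes to standard output) can be sketched with a plain wrapper. Note that `run_tool` below is a hypothetical helper for illustration, not part of the EasyInterface API:

```python
import subprocess

def run_tool(cmd: list[str]) -> str:
    """Run a command-line research tool and return its standard output,
    which is the only integration point the abstract mentions."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout
```

A GUI front end would then only need to collect arguments, call such a wrapper, and render the captured text.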
POP Nicolae Al.
Starting from the meaning of the communication process in marketing, the authors try to identify its role in ensuring the continuity of the management process with respect to the long-term relationships between all the partners of the company. Emphasis is placed on the role of online communication and its tools in relationship marketing. To validate some of these ideas, the authors undertook a qualitative marketing research study among the managers of some Romanian tourism companies. The qualitative part of the study aimed to identify the main tools that form the basis of communication with the beneficiaries of tourism services, and the way in which companies use online communication tools to attract, keep and develop long-term relationships with their customers in the virtual environment. The following tools were analyzed: websites, email marketing campaigns, e-newsletters, online advertising, search engines, sponsored links, blogs, RSS feeds, social networks, forums, online discussion groups, portals, infomediaries and instant messaging. The chosen investigation method was the selective survey; the research technique, explorative interrogation; and the research instrument, a semi-structured in-depth interview based on a conversation guide. Particularly important is the ranking obtained when respondents were asked to name the most efficient tools for attracting customers and for maintaining relationships with them. Although awareness of online marketing tools is high, some tools are known by definition but not used at all or not used correctly, while others are not known by definition but are used in practice. The authors' contribution lies in validating a performant qualitative research methodology, a study which will open new ways and means for making the online communication tools used for touristic services in
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
Background: International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives: This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods: We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. Results: The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between first and second visits were statistically significant (F(1,41) = 25.38). The multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065
Macdermid, Joy C; Miller, Jordan; Gross, Anita R
Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain-a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi
For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
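The risk definition given above (probability of a successful attack times the value of the resulting loss) translates directly into a risk-reduction calculation for a mitigation. A minimal sketch follows; the probabilities and loss figure are illustrative numbers, not values from the survey:

```python
def expected_risk(p_attack: float, loss: float) -> float:
    """Expected risk: probability of a successful attack times resulting loss."""
    return p_attack * loss

def risk_reduction(p_before: float, p_after: float, loss: float) -> float:
    """Reduction in expected risk achieved by a mitigation that lowers
    the attack success probability from p_before to p_after."""
    return expected_risk(p_before, loss) - expected_risk(p_after, loss)
```

For example, a mitigation that cuts attack success probability from 0.3 to 0.1 against a $1M loss yields a $200k expected-risk reduction, a number that can then be weighed against the mitigation's cost.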
Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly
Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms -- "sniffers", administrative reports, decision support and clinical research applications are presented.
James S. Bates
Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesot...
Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita
Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaptation and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Trexler, Grant Lewis
This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…
Narasimharao, B. Pandu, Ed.; Wright, Elizabeth, Ed.; Prasad, Shashidhara, Ed.; Joshi, Meghana, Ed.
Higher education institutions play a vital role in their surrounding communities. Besides providing a space for enhanced learning opportunities, universities can utilize their resources for social and economic interests. The "Handbook of Research on Science Education and University Outreach as a Tool for Regional Development" is a…
Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.
Four areas of research that will have a significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered: (1) automation; (2) accurate multigamma-ray sources; (3) accuracy of the current and future γ-ray energy scale; and (4) new solid-state X- and γ-ray detectors.
Tahmasebi, Farhad; Pearce, Robert
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.
The article is a reflection on the use of an oral diary as a qualitative research tool, the role that it played during fieldwork and the methodological issues that emerged. It draws on a small-scale empirical study into primary school teachers' use of group discussion, during which oral diaries were used to explore and document teacher reflective…
Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.
Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…
Roerig, S.; Evers, S.J.T.M.; Krabbendam, L.
The relation between theatre, or drama, and research is not novel, as illustrated by concepts such as role theory, theatre for development, or distancing in drama therapy. In various scientific fields theatre is used as a communicative and/or educative tool; however, in the realm of childhood
Pitsch, Karola; Neumann, Alexander; Schnier, Christian; Hermann, Thomas
We suggest that an Augmented Reality (AR) system for coupled interaction partners provides a new tool for linguistic research that allows researchers to manipulate the co-participants' real-time perception and action. It encompasses novel facilities for recording heterogeneous sensor-rich data sets to be accessed in parallel with qualitative/manual and quantitative/computational methods.
Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki
Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…
Virtual Globes are a paradigm shift in the way earth sciences are conducted. With these tools, nearly all aspects of earth science can be integrated, from field science, to remote sensing, to remote collaborations, to logistical planning, to data archival/retrieval, to PDF paper retrieval, to education and outreach. Here we present an example of how VGs can be fully exploited for field sciences, using research at McCall Glacier, in Arctic Alaska.
Mulligan, Angela A; Luben, Robert N; Bhaniani, Amit; Parry-Smith, David J; O'Connor, Laura; Khawaja, Anthony P; Forouhi, Nita G; Khaw, Kay-Tee
To describe the research methods for the development of a new open source, cross-platform tool which processes data from the European Prospective Investigation into Cancer and Nutrition Norfolk Food Frequency Questionnaire (EPIC-Norfolk FFQ). A further aim was to compare nutrient and food group values derived from the current tool (FETA, FFQ EPIC Tool for Analysis) with the previously validated but less accessible tool, CAFÉ (Compositional Analyses from Frequency Estimates). The effect of text matching on intake data was also investigated. Cross-sectional analysis of a prospective cohort study (EPIC-Norfolk). East England population (city of Norwich and its surrounding small towns and rural areas). Complete FFQ data from 11 250 men and 13 602 women (mean age 59 years; range 40-79 years). Nutrient and food group intakes derived from FETA and CAFÉ analyses of EPIC-Norfolk FFQ data. Nutrient outputs from FETA and CAFÉ were similar; mean (SD) energy intake from FETA was 9222 kJ (2633) in men, 8113 kJ (2296) in women, compared with CAFÉ intakes of 9175 kJ (2630) in men, 8091 kJ (2298) in women. The majority of differences resulted in one or less quintile change (98.7%). Only mean daily fruit and vegetable food group intakes were higher in women than in men (278 vs 212 and 284 vs 255 g, respectively). Quintile changes were evident for all nutrients, with the exception of alcohol, when text matching was not executed; however, only the cereals food group was affected. FETA produces similar nutrient and food group values to the previously validated CAFÉ but has the advantages of being open source, cross-platform and complete with a data-entry form directly compatible with the software. The tool will facilitate research using the EPIC-Norfolk FFQ, and can be customised for different study populations.
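The quintile-agreement comparison reported above (98.7% of differences within one quintile) can be sketched as follows. The intake values below are invented for illustration; they are not EPIC-Norfolk data:

```python
import pandas as pd

# Hypothetical daily energy intakes (kJ) from two analysis tools for the
# same ten participants; in the study these would be FETA and CAFÉ outputs.
tool_a = pd.Series([9000, 8500, 7800, 10200, 9100, 8300, 7600, 9900, 8800, 9500])
tool_b = pd.Series([8950, 8525, 7810, 10150, 9080, 8290, 7630, 9850, 8820, 9480])

# Assign each participant to an intake quintile under each tool.
q_a = pd.qcut(tool_a, 5, labels=False)
q_b = pd.qcut(tool_b, 5, labels=False)

# Proportion of participants whose quintile shifts by at most one.
agreement = (abs(q_a - q_b) <= 1).mean()
```

Because the two hypothetical tools rank participants identically here, agreement comes out as 1.0; with real paired outputs the same three lines yield the within-one-quintile statistic directly.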
Du, Z C; Lv, C F; Hong, M S
A new error modelling and identification method based on the cross grid encoder is proposed in this paper. In general, there are 21 error components in the geometric error of a 3-axis NC machine tool. However, according to our theoretical analysis, the squareness error among different guideways affects not only the translational error components but also the rotational ones. Therefore, a revised synthetic error model is developed, and the mapping relationship between the error components and the radial motion error of a round workpiece manufactured on the NC machine tool is deduced. This mapping relationship shows that the radial error of circular motion is the comprehensive result of all the error components of the link, worktable, sliding table and main spindle block. To overcome the solution-singularity shortcoming of traditional error-component identification methods, a new multi-step identification method using cross grid encoder measurement is proposed, based on the kinematic error model of the NC machine tool. Firstly, the 12 translational error components of the machine tool are measured and identified using the least squares method (LSM) while the machine performs linear motions in the three orthogonal planes: the XOY, XOZ and YOZ planes. Secondly, the circular error tracks are measured while the machine performs circular motions in the same orthogonal planes using the cross grid encoder Heidenhain KGM 182, from which the 9 rotational errors are identified using the LSM. Finally, experimental validation of the above modelling theory and identification method is carried out on the 3-axis CNC vertical machining centre Cincinnati 750 Arrow, where all 21 error components were successfully measured by this method. This research shows that the multi-step modelling and identification method is well suited for on-machine measurement.
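The least squares identification step described above can be illustrated generically: the measured motion errors are modelled as a linear function of the unknown geometric error components, and the components are recovered by solving the overdetermined system. The design matrix below is random stand-in data, not the paper's actual kinematic error model:

```python
import numpy as np

# Hypothetical linear model: measured errors b = A @ x, where x stacks the
# 12 unknown translational error components and each row of A encodes the
# kinematic model at one measured position (stand-in values here).
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 12))                      # 50 measurements, 12 components
x_true = rng.normal(size=12)                       # "true" error components
b = A @ x_true + rng.normal(scale=1e-6, size=50)   # add small measurement noise

# Least squares identification of the error components.
x_est, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
```

With more measurements than unknowns and a well-conditioned model matrix, the least squares solution recovers the components to within the measurement noise, which is the property the multi-step method relies on in each plane.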
Seul, M.; Brazil, L.; Castronova, A. M.
CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration. Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, as it is a central need for many of these research topics. CUAHSI (The Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source CUAHSI-supported tools that enable enhanced data discovery online using advanced searching capabilities and computational analysis run in virtual environments pre-designed for educators and scientists so they can focus their efforts on data analysis rather than IT set-up.
Many firms offer Information Technology research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.
Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a 'tool builder', his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms.
's claim by fellow scientists, and (3) demonstrate the utility and value of the research contribution to any interested parties. However, turning an exploratory prototype into a "proper" tool for end-users often entails great effort. Heavyweight mainstream frameworks such as Eclipse do not address this issue; their steep learning curves constitute substantial entry barriers to such ecosystems. In this paper, we present the Model Analyzer/Checker (MACH), a stand-alone tool with a command-line interpreter. MACH integrates a set of research prototypes for analyzing UML models. By choosing a simple command-line interpreter rather than a (costly) graphical user interface, we achieved the core goal of quickly deploying research results to a broader audience while keeping the required effort to an absolute minimum. We analyze MACH as a case study of how requirements and constraints in an academic...
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Purpose: The aim of this study is to investigate the importance of Knowledge Management as a tool for improving business processes in a context different from industrial organizations: an archaeological museum. Design/methodology/approach: Using data collected from the National Museum of the Sultanate of Oman in Muscat, a methodology for the analysis and improvement of processes (the Business Cycle Management Process, CMP) is designed and validated. This approach is described as an eight-phase process based on Six Sigma DMAIC. The model has a characteristic "P" shape. Findings: As the results obtained by the process-improvement initiative show, we highlight the relevance of the improvement in all aspects regarding the security of showcases in that context. Research limitations/implications: The complexity of implementing indicators, and the partial vision of the project, as data were only obtained from part of one of the companies involved in the construction of the museum. An important implication of this paper is that it presents a methodology to improve museum processes, focusing on the reduction of errors while also adding value for visitors. Practical implications: The relevance of intervening on certain variables at different levels of management performance is verified. Social implications: Improving the quality of leisure services leads to the identification of certain challenges regarding the nature and competitiveness of cultural services. Originality/value: The current work has served as a repository of knowledge applicable to new similar projects, which should take into account the peculiarities of each case and, in particular, the level of quality demanded by the client in a cultural context. It is important to take into account the degree of avoidable dissatisfaction (the number of solvable problems that would lead to dissatisfaction), the opportunity for improvement, the reduction of operational waste and the need
Full Text Available Participatory-action research encourages the involvement of all key stakeholders in the research process and is especially well suited to mental health research. Previous literature outlines the importance of engaging stakeholders in the development of research questions and methodologies, but little has been written about ensuring the involvement of all stakeholders (especially non-academic members) in dissemination opportunities such as publication development. The Article Idea Chart was developed as a specific methodology for engaging all stakeholders in data analysis and publication development. It has been successfully utilised in a number of studies and is an effective tool for ensuring that the dissemination process of participatory-action research results is both inclusive and transparent to all team members, regardless of stakeholder group. Keywords: participatory-action research, mental health, dissemination, community capacity building, publications, authorship
Anna Kirova PhD
Full Text Available In this article the authors explore the effect of word-image relationships on the collection of data and the reporting of research results for a study involving the development of a series of fotonovelas with immigrant children in an inner-city school. The central question explored in this article is: Can experiences such as producing visual narratives in the form of fotonovelas stimulate multiple expressions of voice and position, and bring awareness of embodied ways of communicating in a culture-rich school context? The processes involved in collaboratively developing the photographic narrative format of the fotonovela combine visual elements and structures and embodied, reflective performance together with written text. As a research method, the fotonovela does not merely translate verbal into visual representations but constructs a hybrid photo-image-text that opens new spaces for dialogue, resistance, and representation of a new way of knowing that changes the way of seeing and has the potential to change the author's and the reader's self-understanding.
Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu
Thermal error is the main factor affecting the accuracy of precision machining. In line with current research interest in machine tool thermal error, this paper experimentally studies thermal error testing and intelligent modeling for the spindle of a vertical high-speed CNC machine tool. Several thermal error testing devices are designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inverse prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
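The modeling pipeline described above can be sketched in a few lines: principal component analysis selects a reduced set of temperature measuring points, and a regression model then maps those points to the displacement. The data below are synthetic stand-ins for the paper's measurements, and plain least squares stands in for the authors' artificial neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the experiment: 7 temperature sensors driven by
# two latent heat sources, plus 2 thermal-displacement channels.
H = rng.normal(size=(200, 2)).cumsum(axis=0)            # latent heat sources
T = H @ rng.normal(size=(2, 7)) + 0.05 * rng.normal(size=(200, 7))
D = H @ rng.normal(size=(2, 2)) + 0.01 * rng.normal(size=(200, 2))

# Principal component analysis on the standardized temperatures shows how
# many independent thermal modes the 7 sensors actually carry.
Tz = (T - T.mean(axis=0)) / T.std(axis=0)
_, s, Vt = np.linalg.svd(Tz, full_matrices=False)
explained = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1

# Keep the k sensors with the largest loadings on the retained modes as
# the optimized temperature measuring points.
key_points = sorted(np.argsort(-np.abs(Vt[:k]).sum(axis=0))[:k].tolist())

# Fit a compensation model on the selected sensors (least squares here;
# the paper trains an artificial neural network at this step).
X = np.hstack([T[:, key_points], np.ones((len(T), 1))])
W, *_ = np.linalg.lstsq(X, D, rcond=None)
rmse = float(np.sqrt(np.mean((X @ W - D) ** 2)))
print(k, key_points, round(rmse, 4))
```

The same structure carries over when the final regression step is replaced by a small neural network trained on the selected temperature points.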
Pazos, Florencio; Chagoyen, Monica
Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects of the moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: email@example.com.
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38). Further research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.
Seltzer, Erica D.; Stolley, Melinda R.; Mensah, Edward K.; Sharp, Lisa K.
Purpose The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace, and their perceptions of using SNS for recruitment into survivorship research. Methods Sixty White, Black and Hispanic adult childhood cancer survivors (range 18-48 years of age), randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study (CHLS), participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Results Seventy percent of participants reported SNS usage, of which 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. Conclusions and implications for cancer survivors The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for research recruitment. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research. PMID:24532046
Costa, Fabricio F
Advances in information technology have improved our ability to gather, collect and analyze information from individuals online. Social networks can be seen as a nonlinear superposition of a multitude of complex connections between people where the nodes represent individuals and the links between them capture a variety of different social interactions. The emergence of different types of social networks has fostered connections between individuals, thus facilitating data exchange in a variety of fields. Therefore, the question posed now is "can these same tools be applied to life sciences in order to improve scientific and medical research?" In this article, I will review how social networks and other web-based tools are changing the way we approach and track diseases in biomedical research. Copyright © 2012 Elsevier Ltd. All rights reserved.
Murphy, Paul V; André, Sabine; Gabius, Hans-Joachim
Coding of biological information is not confined to nucleic acids and proteins. Endowed with the highest level of structural versatility among biomolecules, the glycan chains of cellular glycoconjugates are well-suited to generate molecular messages/signals in a minimum of space. The sequence and shape of oligosaccharides as well as spatial aspects of multivalent presentation are assumed to underlie the natural specificity/selectivity that cellular glycans have for endogenous lectins. In order to eventually unravel structure-activity profiles, cyclic scaffolds have been used as platforms to produce glycoclusters and afford valuable tools. Using adhesion/growth-regulatory galectins and the pan-galectin ligand lactose as a model, emerging insights into the potential of cyclodextrins, cyclic peptides, calixarenes and glycophanes for this purpose are presented herein. The systematic testing of lectin panels with spatially defined ligand presentations can be considered a biomimetic means to help clarify the mechanisms which lead to the exquisite accuracy with which endogenous lectins select their physiological counterreceptors from the complexity of the cellular glycome.
Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia
Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those work flows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012.
Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A
Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
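A minimal sketch of the 'pick off' idea, using Python and SQLite in place of the authors' PHP/MySQL stack: a single order-entry table is queried on a recurring schedule rather than building a data warehouse. The table schema and order names are illustrative, not the authors' actual data model.

```python
import sqlite3

# In-memory stand-in for the output of one clinical order-entry system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (patient_id TEXT, order_type TEXT, ordered_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("p1", "isolation", "2024-01-01"),
     ("p2", "lab",       "2024-01-01"),
     ("p3", "isolation", "2024-01-02")])

def build_query(order_type):
    # The paper's web interface auto-generates order-related queries like
    # this one from user-selected criteria; parameterization keeps the
    # generated SQL injection-safe when executed at recurring intervals.
    sql = ("SELECT DISTINCT patient_id FROM orders "
           "WHERE order_type = ? ORDER BY patient_id")
    return sql, (order_type,)

sql, params = build_query("isolation")
patients = [row[0] for row in conn.execute(sql, params)]
print(patients)  # → ['p1', 'p3']
```

In the described tool, queries of this kind are assembled through a web form and rerun periodically, e.g. to flag patients with 'isolation' orders or screen for trial eligibility.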
One of the main requirements in agricultural research is to analyse a large number of samples for one or more chemical constituents and physical properties. In plant breeding programmes and germplasm evaluation, the analysis must be fast, as many samples are to be analysed. Pulsed nuclear magnetic resonance (NMR) is a potential tool for developing rapid and nondestructive methods of analysis. Various applications of low resolution pulsed NMR in agricultural research, which are generally used as screening methods, are briefly described. 25 refs., 2 figs., 2 tabs
The GLARE: Grease Lubrication Apparatus for Research and Education was designed as a fourth-year thesis project with the University of Ontario Institute of Technology (UOIT). The purpose of the apparatus is to train Ontario Power Generation Nuclear (OPGN) staff to properly lubricate bearings with grease and to help detect early equipment failures. Proper re-lubrication is critical in the nuclear industry, as equipment may be inaccessible for long periods of time. A secondary purpose for the tool is UOIT research and undergraduate laboratories. This abstract provides an overview of the project and its application to the nuclear industry. (author)
Daim, Tugrul; Kim, Jisun
Technologies such as renewable energy alternatives including wind, solar and biomass, storage technologies and electric engines are creating a different landscape for the electricity industry. Using sources and ideas from technologies such as renewable energy alternatives, Research and Technology Management in the Electricity Industry explores a different landscape for this industry and applies it to the electric industry supported by real industry cases. Divided into three sections, Research and Technology Management in the Electricity Industry introduces a range of methods and tools includ
Environmental diagnostic methodology of hydrographic basins by using geoprocessing tools aiming the planning and management of water resources; Metodologia de diagnostico ambiental de bacias hidrograficas utilizando ferramentas de geoprocessamento visando o planejamento e o gerenciamento de recursos hidricos
Aquino, Luiz Carlos Servulo de; Primo, Paulo Bidegain da Silveira; Vieira, Hermani de Moraes; Lacorte, Ana Castro; Menezes, Paulo Cesar Pires; Andrade, Hugo de Souza [UNESCO, Rio de Janeiro, RJ (Brazil)]. E-mails: firstname.lastname@example.org; email@example.com; firstname.lastname@example.org; email@example.com; firstname.lastname@example.org; email@example.com
This work presents the methodology developed for the research named 'Synopsis of the South Atlantic Basins - East and Southeast Intervals', with two specific objectives: helping the technicians of the Water National Agency (WNA) in the decision-making planning process, and providing subsidies for the implantation of the National Policy of Hydric Resources Management. The main methodological product is the systematization of the social-environmental, hydrological and hydro-energetic data, and the elaboration of thematic maps synthesising bibliographic and cartographic information coming from various administrative circles and from each studied hydrographic basin.
Effective mentoring is a critical component in the training of early-career researchers, cultivating more independent, productive and satisfied scientists. For example, mentoring was shown by the 2005 Sigma Xi National Postdoc Survey to be a key indicator of a successful postdoctoral outcome. Mentoring takes many forms and can include support for maximizing research skills and productivity as well as assistance in preparing for a chosen career path. Yet, because there is no "one-size-fits-all" approach, mentoring can be an activity that is hard to define. In this presentation, a series of tips and tools will be offered to aid mentors in developing a plan for their mentoring activities. This will include: suggestions for how to get started; opportunities for mentoring activities within the research group, within the institution, and outside the institution; tools for communicating and assessing professional milestones; and resources for fostering the professional and career development of mentees. Special considerations will also be presented for mentoring international scholars and women. These strategies will be helpful to the PI responding to the new NSF mentoring plan requirement for postdocs as well as to the student, postdoc, researcher or professor overseeing the research and training of others.
A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of the component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes accomplishments on baseline research and advancements in design studies and system-level assessment, including cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.
Sarah E. Council
Full Text Available The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects.
Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
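As a sketch of the upload-script idea, the helper below composes an XNAT-style REST path for attaching a derived result (such as a segmentation) to a previously uploaded session. The path pattern follows XNAT's documented REST API, but the server name, project and identifiers are hypothetical, and a real client would still authenticate and PUT the file body with an HTTP library.

```python
from urllib.parse import quote

def resource_file_url(base, project, subject, session, resource, filename):
    """Build the REST path for storing a file as an additional resource on
    an imaging session. All identifiers passed in here are illustrative."""
    parts = ["data", "projects", project, "subjects", subject,
             "experiments", session, "resources", resource, "files", filename]
    # Percent-encode each path segment so spaces etc. survive the round trip.
    return base.rstrip("/") + "/" + "/".join(quote(p, safe="") for p in parts)

url = resource_file_url("https://xnat.example.edu", "RETINA01", "SUBJ001",
                        "OCT_2013_01", "SEGMENTATION", "macula seg.nii.gz")
print(url)
```

Wrapping path construction this way is what lets project-specific upload scripts stay short: only the identifiers change between projects, while the REST pattern stays fixed.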
Guertin, L. A.
VoiceThread has been utilized in an undergraduate research methods course for peer review and final research project dissemination. VoiceThread (http://www.voicethread.com) can be considered a social media tool, as it is a web-based technology with the capacity to enable interactive dialogue. VoiceThread is an application that allows a user to place a media collection online containing images, audio, videos, documents, and/or presentations in an interface that facilitates asynchronous communication. Participants in a VoiceThread can be passive viewers of the online content or engaged commenters via text, audio, video, with slide annotations via a doodle tool. The VoiceThread, which runs across browsers and operating systems, can be public or private for viewing and commenting and can be embedded into any website. Although few university students are aware of the VoiceThread platform (only 10% of the students surveyed by Ng (2012)), the 2009 K-12 edition of The Horizon Report (Johnson et al., 2009) lists VoiceThread as a tool to watch because of the opportunities it provides as a collaborative learning environment. In Fall 2011, eleven students enrolled in an undergraduate research methods course at Penn State Brandywine each conducted their own small-scale research project. Upon conclusion of the projects, students were required to create a poster summarizing their work for peer review. To facilitate the peer review process outside of class, each student-created PowerPoint file was placed in a VoiceThread with private access to only the class members and instructor. Each student was assigned to peer review five different student posters (i.e., VoiceThread images) with the audio and doodle tools to comment on formatting, clarity of content, etc. After the peer reviews were complete, the students were allowed to edit their PowerPoint poster files for a new VoiceThread. In the new VoiceThread, students were required to video record themselves describing their research
Full Text Available Assembly accounts for the greatest workload and time consumption in the product design and manufacturing process. The CNC machine tool is a key piece of basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, the semantic information of transmission, and the geometric constraint information were quantified into an assembly connection strength that depicts the level of assembly difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was conducted, and interferential assembly units were identified and revised. The assembly sequence planning and optimization of parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explores assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and realizes the optimized assembly sequence for the headstock of a CNC machine tool.
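The genetic-algorithm step can be illustrated with a toy sequence optimizer: permutations of parts are evolved so that strongly connected parts end up close together in the assembly sequence. The connection-strength values and the distance-based cost below are illustrative stand-ins for the paper's ASRG-derived objective.

```python
import random

random.seed(42)

# Toy assembly connection strengths between 6 parts (symmetric); higher
# values mean two parts are harder to assemble far apart in the sequence.
n = 6
pairs = {(0, 1): 9, (1, 2): 7, (2, 3): 8, (3, 4): 6,
         (4, 5): 9, (0, 5): 3, (1, 4): 2}

def cost(seq):
    # Penalize sequences that place strongly connected parts far apart:
    # a rough stand-in for the paper's assembly-difficulty objective.
    return sum(w * abs(seq.index(a) - seq.index(b))
               for (a, b), w in pairs.items())

def crossover(p1, p2):
    # Order crossover (OX): keep a slice of p1, fill the rest from p2.
    i, j = sorted(random.sample(range(n), 2))
    mid = p1[i:j]
    rest = [g for g in p2 if g not in mid]
    return rest[:i] + mid + rest[i:]

def mutate(seq):
    i, j = random.sample(range(n), 2)
    seq[i], seq[j] = seq[j], seq[i]

pop = [random.sample(range(n), n) for _ in range(30)]
for _ in range(60):
    pop.sort(key=cost)
    nxt = pop[:6]                      # elitism: keep the best sequences
    while len(nxt) < 30:
        child = crossover(*random.sample(pop[:15], 2))
        if random.random() < 0.2:
            mutate(child)
        nxt.append(child)
    pop = nxt

best = min(pop, key=cost)
print(best, cost(best))
```

In the paper this kind of search is run both within each assembly unit and between units, with the real connection strengths supplying the objective.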
The multi-disciplinary and international nature of large European projects requires powerful managerial and communication tools to ensure the transmission of information to end-users. One such project is TRACE ("Tracing Food Commodities in Europe"). One of its objectives is to provide a communication system intended to be the central source of information on food authenticity and traceability in Europe. This paper explores the web tools used and the communication vehicles offered to scientists involved in the TRACE project to communicate internally as well as with the public. Two main tools have been built: an intranet and a public website. The TRACE website can be accessed at http://www.trace.eu.org. Particular emphasis was placed on the efficiency, relevance and accessibility of the information, the publicity of the website, and the use of the collaborative utilities. The rationale of the web space design, as well as the integration of proprietary software solutions, is presented. Perspectives on the use of web tools in research projects are discussed.
Johnson, Emilie K; Broder-Fingert, Sarabeth; Tanpowpong, Pornthep; Bickel, Jonathan; Lightdale, Jenifer R; Nelson, Caleb P
The i2b2 (informatics for integrating biology and the bedside) clinical data informatics framework aims to create an efficient structure within which patients can be identified for clinical and translational research projects. Our objective was to describe the respective roles of the i2b2 research query tool and the electronic medical record (EMR) in conducting a case-control clinical study at our institution. We analyzed the process of using i2b2 and the EMR together to generate a complete research database for a case-control study that sought to examine risk factors for kidney stones among gastrostomy tube (G-tube) fed children. Our final case cohort consisted of 41/177 (23%) of the potential cases initially identified by i2b2, who were matched with 80/486 (17%) of the potential controls. Cases were 10 times more likely to be excluded for inaccurate coding regarding stones than for inaccurate coding regarding G-tubes. A majority (67%) of case exclusions occurred because clinical inclusion criteria were not met, whereas a majority of control exclusions (72%) occurred because the clinical data necessary for study completion were inadequate. Full dataset assembly required complementary information from i2b2 and the EMR. i2b2 was critical as a query analysis tool for patient identification in our case-control study. Patient identification via procedural coding appeared more accurate than identification via diagnosis coding. Completion of our investigation required iterative interplay of i2b2 and the EMR to assemble the study cohort.
Argyropoulou, Eleftheria; Hatira, Kalliopi
This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…
Rodney R. Dietert
Academic preparation of science researchers and/or human or veterinary medicine clinicians through the science, technology, engineering, and mathematics (STEM) curriculum has usually focused on the students (1) acquiring increased disciplinary expertise, (2) learning needed methodologies and protocols, and (3) expanding their capacity for intense, persistent focus. Such educational training is effective until roadblocks or problems arise with this highly-learned approach. Then, the health science trainee may have few tools available for effective problem solving. Training to achieve flexibility, adaptability, and broadened perspectives using contemplative practices has been rare among biomedical education programs. To address this gap, a Cornell University-based program involving formal biomedical science coursework and health science workshops has been developed to offer science students, researchers and health professionals a broader array of personal, contemplation-based, problem-solving tools. This STEM educational initiative includes first-person exercises designed to broaden perceptional awareness, decrease emotional drama, and mobilize whole-body strategies for creative problem solving. Self-calibration and journaling are used for students to evaluate the personal utility of each exercise. The educational goals are to increase student self-awareness and self-regulation and to provide trainees with value-added tools for career-long problem solving. Basic elements of this educational initiative are discussed using the framework of the Tree of Contemplative Practices.
Dubosarsky, Mia D.
How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program can be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered with a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.
The paper describes research on and development of the casting and solidification of slab ingots from special tool steels by means of numerical modelling using the finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Problems with determining the thermophysical properties of materials and the heat transfer between the individual parts of the casting system are also discussed. Based on the grade of tool steel, the risk of final porosity is predicted. The results made it possible to improve the production technology of slab ingots and to verify the ratio, the chamfer and the external/internal shape of the wall of the newly designed slab ingots.
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Over the last 60 years research reactors (RRs) have played an important role in the technological and socio-economic development of mankind, for example through radioisotope production for medicine, industry, research and education. Neutron scattering has been widely used for research and development in materials science, and its prospects as a powerful tool for materials research are growing in the 21st century. This can be seen from the investment in several new neutron sources all over the world, such as the Spallation Neutron Source (SNS) in the USA, the Japan Proton Accelerator Research Complex (J-PARC) in Japan, the new OPAL reactor in Australia, and upgrades to existing sources at ISIS, Rutherford Appleton Laboratory, UK; the Institut Laue-Langevin (ILL) in Grenoble, France; and the Berlin reactor, Germany. Developing countries with moderate-flux research reactors, such as India, Malaysia and Indonesia, have also adopted this technique. The Siwabessy Multipurpose Reactor in Serpong, Indonesia, which also produces thermal neutrons, has contributed to research and development in the Asia-Pacific region. However, international joint research among those countries plays an important role in optimizing the results. (author)
Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa
To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.
The UbuntuNet Alliance is well placed to facilitate interaction between education and research institutions and African academics and researchers in the Diaspora so that together they can strengthen research that will exploit new technological tools and increase the industrial base. It is envisaged that the Alliance will become an important vehicle for linkages that will facilitate the repatriation of scientific knowledge and skills to Africa and even help reduce, and eventually eradicate, the brain drain which has taken so many excellent intellectuals to the developed world. As organisational vehicles for inter-institutional collaboration, both established and emerging NRENs can play a critical role in reversing these trends and in mitigating the negative impact of the brain drain.
Darling, John A.; Frederick, Raymond M.
Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size
Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin
Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.
Sade, Christian; de Barros, Leticia Maria Renault; Melo, Jorge José Maciel; Passos, Eduardo
This paper seeks to assess a way of conducting interviews in line with the ideology of Brazilian Psychiatric Reform. In the methodology of participative intervention and research in mental health, the interview is less a data collection than a data harvesting procedure. It is designed to apply the principles of psychosocial care, autonomy as the basis for treatment, the predominance of the users and of their social networks and civic participation. Inspired by the Explicitation Interview technique, the contention is that the handling of the interview presupposes an open attitude able to promote and embrace different viewpoints. This attitude makes the interview a collective experience of sharing and belonging, allowing participants to reposition themselves subjectively in treatment with the emergence of groupality. As an example of using the interview as a methodological tool in mental health research, we examine research into adaptation of the tool of Autonomous Medication Management (GAM). It is an interventionist approach guided by principles that foster autonomy and the protagonist status of users of psychotropic medication, their quality of life, their rights and recognition of the multiple significances of medication, understood here as a collective interview technique.
Maimon, Eric; Samuni, Uri; Goldstein, Sara
Radicals are part of the chemistry of life, and ionizing radiation chemistry serves as an indispensable research tool for elucidation of the mechanism(s) underlying their reactions. The ever-increasing understanding of their involvement in diverse physiological and pathological processes has expanded the search for compounds that can diminish radical-induced damage. This review surveys the areas of research focusing on radical reactions and particularly with stable cyclic nitroxide radicals, which demonstrate unique antioxidative activities. Unlike common antioxidants that are progressively depleted under oxidative stress and yield secondary radicals, nitroxides are efficient radical scavengers yielding in most cases their respective oxoammonium cations, which are readily reduced back in the tissue to the nitroxide thus continuously being recycled. Nitroxides, which not only protect enzymes, cells, and laboratory animals from diverse kinds of biological injury, but also modify the catalytic activity of heme enzymes, could be utilized in chemical and biological systems serving as a research tool for elucidating mechanisms underlying complex chemical and biochemical processes.
Horvath, Monica M; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey
In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to facilitate this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust web application for cohort identification and data extraction: the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides a Guided Query functionality using a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must provide sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. Copyright © 2010 Elsevier Inc. All rights reserved.
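The Guided Query idea, letting users specify filters in a wizard so that no SQL is ever written by hand, can be sketched in a few lines. This is a toy illustration only; the table and column names are hypothetical and do not reflect DEDUCE's actual schema or implementation.

```python
# Toy sketch of a "guided query": structured filter choices from a
# wizard-like interface are translated into a parameterised SELECT,
# so the user never writes SQL. All names here are hypothetical.
def build_query(table: str, filters: dict) -> tuple:
    """Turn {column: value} filters into a parameterised SQL string."""
    where = " AND ".join(f"{col} = ?" for col in filters)
    sql = f"SELECT * FROM {table}"
    if where:
        sql += f" WHERE {where}"
    return sql, list(filters.values())

sql, params = build_query(
    "encounters",
    {"diagnosis_code": "N20.0", "department": "urology"},
)
print(sql)
print(params)
```

Using `?` placeholders rather than string interpolation keeps the values out of the SQL text, which is the standard safeguard against injection in such query builders.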
Tomás, Concepción; Yago, Teresa; Eguiluz, Mercedes; Samitier, M A Luisa; Oliveros, Teresa; Palacios, Gemma
To validate the questionnaire "Gender Perspective in Health Research" (GPIHR) to assess the inclusion of the gender perspective in research projects. Validation study in two stages: feasibility was analysed in the first; reliability, internal consistency and validity in the second. Aragón Institute of Health Science, Aragón, Spain. The GPIHR was applied to 118 research projects funded in national and international competitive tenders from 2003 to 2012. Inter- and intra-observer reliability were analysed with the Kappa index and internal consistency with Cronbach's alpha. Content validity was analysed through literature review and construct validity with an exploratory factor analysis. The validated GPIHR has 10 questions: 3 on the introduction, 1 on objectives, 3 on methodology and 3 on research purpose. Average time of application was 13 min. Inter-observer reliability (Kappa) varied between 0.35 and 0.94 and intra-observer reliability between 0.40 and 0.94. The theoretical construct is supported in the literature. Factor analysis identified three levels of gender perspective inclusion: "difference by sex", "gender sensitive" and "feminist research", with internal consistencies of 0.64, 0.87 and 0.81, respectively, which explain 74.78% of variance. The GPIHR questionnaire is a valid tool to assess the gender perspective and is useful for researchers who would like to include it in their projects. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
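Internal-consistency figures like the 0.64, 0.87 and 0.81 reported above are Cronbach's alpha values: the number of items scaled by one minus the ratio of summed item variances to the variance of the total score. A minimal sketch of the computation, using hypothetical item scores rather than GPIHR data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for a 3-item subscale rated on 6 projects
scores = np.array([
    [2, 3, 3],
    [1, 1, 2],
    [3, 3, 3],
    [2, 2, 3],
    [1, 2, 1],
    [3, 3, 2],
])
print(round(cronbach_alpha(scores), 2))
```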
Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa
Few resources are available to quantify clinical trial-associated workload, which is needed to guide staffing and budgetary planning. The aim of this study is to describe a tool that measures clinical trials nurses' workload expressed as time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activity with mathematical functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally. © 2016 John Wiley & Sons Australia, Ltd.
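The tool's core computation, combining the timed average per activity with how often each activity occurs in a given trial, reduces to a weighted sum. A minimal sketch; the activity names, times, and counts below are hypothetical, not the study's actual 30-item list:

```python
# Hypothetical average minutes per core activity (from timed observations)
AVG_MINUTES = {
    "informed_consent": 45.0,
    "randomisation": 10.0,
    "blood_sampling": 15.0,
    "adverse_event_reporting": 30.0,
    "data_entry_per_visit": 20.0,
}

def trial_nursing_time(activity_counts: dict) -> float:
    """Estimated nurse workload (minutes) for one trial:
    sum over activities of (average activity time x occurrence count)."""
    return sum(AVG_MINUTES[a] * n for a, n in activity_counts.items())

# A hypothetical trial: 20 patients consented and randomised once each,
# six visits per patient with sampling and data entry, eight AE reports
demo = {
    "informed_consent": 20,
    "randomisation": 20,
    "blood_sampling": 120,
    "adverse_event_reporting": 8,
    "data_entry_per_visit": 120,
}
print(trial_nursing_time(demo) / 60, "hours")
```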
Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A
Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies.
Combination of bioaffinity and chromatography gave birth to affinity chromatography. A further combination with frontal analysis resulted in the creation of frontal affinity chromatography (FAC). This new versatile research tool enabled detailed analysis of weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC is now the best method for the investigation of saccharide-binding proteins (lectins) from the viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology. It opened a door leading to deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described.
S. P. Baranenko
Enterprise restructuring is aimed at adapting an enterprise to market conditions and improving its competitiveness through selection of the most effective model for using material, technical, technological, organizational, commercial, economic, financial, tax-related and other resources, with due account of demand. Classification criteria and types of restructuring, as well as restructuring aims specific to industrial enterprises, are presented.
Pigosso, Daniela Cristina Antelmi; McAloone, T. C.; Rozenfeld, H.
Ecodesign is a proactive management approach that integrates environmental considerations into product development and related processes (such as purchasing, marketing and research & development). Ecodesign aims to improve the environmental performance of products throughout their life cycle, from raw material extraction and manufacturing to use and end-of-life. Over the last three decades, an intense development of new ecodesign methods and tools could be observed, but uptake by industry remains a challenge. The purpose of this research is to perform a review of existing ecodesign tools and methods through a systematic literature review linked to bibliometric analyses, in order to explore the state of the art of ecodesign methods and tools and identify trends and opportunities in the field for the next decade.
Park, Sinyoung; Nam, Chung Mo; Park, Sejung; Noh, Yang Hee; Ahn, Cho Rong; Yu, Wan Sun; Kim, Bo Kyung; Kim, Seung Min; Kim, Jin Seok; Rha, Sun Young
With the growing amount of clinical research, regulations and research ethics are becoming more stringent. This trend introduces a need for quality assurance measures to ensure adherence to research ethics and human research protection beyond Institutional Review Board approval. Audits, one of the most effective tools for assessing quality assurance, are used to evaluate Good Clinical Practice (GCP) and protocol compliance in clinical research. However, they are laborious, time consuming, and require expertise. Therefore, we developed a simple auditing process (a screening audit) and evaluated its feasibility and effectiveness. The screening audit was developed using a routine audit checklist based on the Severance Hospital Human Research Protection Program policies and procedures. The measure includes 20 questions, and results are summarized in five categories of audit findings. We analyzed 462 studies that were reviewed by the Severance Hospital Human Research Protection Center between 2013 and 2017, retrospectively examining research characteristics, reply rate, audit findings, associated factors, post-screening audit compliance, etc. Investigator reply rates gradually increased, except for the first year (73% → 26% → 53% → 49% → 55%). The studies were graded as "critical," "major," "minor," or "not a finding" (11.9, 39.0, 42.9, and 6.3%, respectively), based on the findings and number of deficiencies. The auditors' decisions showed fair agreement, with weighted kappa values of 0.316, 0.339, and 0.373. Low-risk studies, single-center studies, and non-phase clinical research were more frequently graded "major" or "critical" (p = 0.002), as was audit grade; post-screening audit compliance checks were applied in "non-responding" and "critical" studies upon applying the screening audit. Our screening audit is a simple and effective way to assess overall GCP compliance by institutions and to
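Inter-auditor agreement on ordinal grades of the kind reported above (weighted kappa values of 0.316, 0.339, and 0.373) is commonly computed as weighted Cohen's kappa, where disagreements are penalised by how far apart the two grades are. A minimal sketch using the study's four grade categories and hypothetical ratings (not the study's data):

```python
import numpy as np

GRADES = ["not a finding", "minor", "major", "critical"]

def weighted_kappa(r1, r2, categories):
    """Linearly weighted Cohen's kappa for two raters' ordinal labels."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):          # observed joint distribution
        obs[idx[a], idx[b]] += 1
    obs /= obs.sum()
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))   # chance agreement
    # linear disagreement weights: |i - j| / (k - 1), zero on the diagonal
    w = np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
    return 1 - (w * obs).sum() / (w * exp).sum()

# Two hypothetical auditors grading six studies
a = ["minor", "minor", "major", "critical", "not a finding", "major"]
b = ["minor", "major", "major", "critical", "minor", "major"]
print(round(weighted_kappa(a, b, GRADES), 3))
```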
This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, as well as executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
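The search described, choosing one candidate mapping per phase so that computation cost plus inter-phase remapping cost is minimised, can be illustrated with a toy exhaustive search. This is a sketch of the cost model only; the phase names, mappings, and costs are hypothetical, and DDT itself explores a much larger, pruned search space rather than brute force:

```python
from itertools import product

# Hypothetical candidate array mappings per phase, with their
# parallel computation cost under that mapping
PHASES = ["init", "solve", "output"]
CANDIDATES = {
    "init":   {"block": 4.0, "cyclic": 6.0},
    "solve":  {"block": 9.0, "cyclic": 5.0},
    "output": {"block": 3.0, "cyclic": 3.5},
}
REMAP_COST = 2.0  # flat cost of redistributing arrays between phases

def best_mapping_sequence():
    """Exhaustively search mapping combinations, minimising total
    computation cost plus remapping cost between adjacent phases."""
    best = (float("inf"), None)
    for combo in product(*(CANDIDATES[p] for p in PHASES)):
        cost = sum(CANDIDATES[p][m] for p, m in zip(PHASES, combo))
        cost += sum(REMAP_COST for x, y in zip(combo, combo[1:]) if x != y)
        best = min(best, (cost, combo))
    return best

print(best_mapping_sequence())
```

Note that the cheapest sequence is not simply the cheapest mapping per phase: the remapping penalty can make it worthwhile to keep a suboptimal mapping across adjacent phases.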
Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C
River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data, while another has sparse; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques standardly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.
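One of the "rudimentary" techniques mentioned, the maximal intensity projection (MIP), is simple to state: each output pixel keeps the brightest voxel along the viewing axis. A minimal NumPy sketch on a hypothetical two-channel confocal stack (random data standing in for real images):

```python
import numpy as np

def max_intensity_projection(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Collapse a 3-D intensity volume to a 2-D image by keeping,
    for each ray along `axis`, the brightest voxel (a standard MIP)."""
    return volume.max(axis=axis)

# Hypothetical two-channel confocal stack: (channel, z, y, x)
rng = np.random.default_rng(0)
stack = rng.random((2, 16, 64, 64))

# Project each channel along z independently; channels can then be
# merged with per-channel colour maps for display
mips = np.stack([max_intensity_projection(c, axis=0) for c in stack])
print(mips.shape)  # (2, 64, 64)
```

The limitation that motivates the interactive tool is visible in this sketch: a MIP discards all depth ordering, whereas transfer-function-based volume rendering retains it.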
Hein Irma M
Background Currently over 50% of drugs prescribed to children have not been evaluated properly for use in their age group. One key reason why children have been excluded from clinical trials is that they are not considered able to exercise meaningful autonomy over the decision to participate. Dutch law states that competence to consent can be presumed present at the age of 12 and above; however, in pediatric practice children’s competence is not that clearly presented and the transition from assent to active consent is gradual. A gold standard for competence assessment in children does not exist. In this article we describe a study protocol on the development of a standardized tool for assessing competence to consent in research in children and adolescents. Methods/design In this study we modified the MacCAT-CR, the best evaluated competence assessment tool for adults, for use in children and adolescents. We will administer the tool prospectively to a cohort of pediatric patients from 6 to 18 years during the selection stages of ongoing clinical trials. The outcomes of the MacCAT-CR interviews will be compared to a reference standard, established by the judgments of clinical investigators, and an expert panel consisting of child psychiatrists, child psychologists and medical ethicists. The reliability, criterion-related validity and reproducibility of the tool will be determined. As MacCAT-CR is a multi-item scale consisting of 13 items, power was justified at 130–190 subjects, providing a minimum of 10–15 observations per item. MacCAT-CR outcomes will be correlated with age, life experience, IQ, ethnicity, socio-economic status and competence judgment of the parent(s). It is anticipated that 160 participants will be recruited over 2 years to complete enrollment. Discussion A validity study on an assessment tool of competence to consent is strongly needed in research practice, particularly in the child and adolescent population.
Many researchers collect online survey data because it is cost-effective and less time-consuming than traditional research methods. This paper describes Twitter chats as a research tool vis-à-vis two other online research methods: providing links to electronic surveys to respondents and use of commercially available survey panels through vendors with readily available respondents. Similar to a face-to-face focus group, Twitter chats provide a synchronous environment for participants to answer a structured series of questions and to respond to both the chat facilitator and each other. This paper also reports representative responses from a Twitter chat that explored financial decisions of young adults. The chat was sponsored by a multi-state group of land-grant university researchers, in cooperation with WiseBread, a personal finance website targeted to millennials, to recruit respondents for a more extensive month-long online survey about the financial decisions of young adults. The Twitter chat responses suggest that student loans were the top concern of participants, and debt and housing rounded out the top three concerns. The internet, both websites and social media, was the most frequently cited source of financial information. The article concludes with a discussion of lessons learned from the Twitter chat experience and suggestions for professional practice.
ADENIYI AKINGBADE WAIDI
A questionnaire is a set of questions designed to gather information or data for analysis. A questionnaire must be adequate, simple, focused, and related to the subject the research is set to address, and must test the hypotheses and questions formulated for the study. Many questionnaires, however, are constructed and administered without following proper guidelines, which undermines their end results. This paper assesses some of the guidelines for constructing questionnaires, as well as their uses and the extent to which they enhance managers' access to reliable data and information. A descriptive method is employed for the study. Findings revealed that poorly prepared questionnaires do not produce effective results, and managers and researchers who use such questionnaires rarely achieve their organisational and research objectives. The need for good, well-prepared and adequate questionnaires is exemplified by the questionnaire's role as a primary tool for analytical research. The study recommends that questionnaires be properly prepared for effective research outcomes.
Pritchett, Amy R.
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
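The core functions described for RFS (registering components, standardizing their interfaces, and routing communication between them) follow a common plugin/message-bus pattern. A minimal sketch of that pattern is below; the class and topic names are hypothetical, and RFS itself uses pre-compiled library plugins and HLA networking, not Python classes.

```python
class SimulationCore:
    """Minimal sketch of a plugin-style simulation core: components
    register with the core, which routes published messages to any
    component that subscribed to the matching topic."""
    def __init__(self):
        self.components = []
        self.subscribers = {}              # topic -> list of callbacks

    def register(self, component):
        """Register and initialize a component, handing it a bus handle."""
        self.components.append(component)
        component.init(self)

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for cb in self.subscribers.get(topic, []):
            cb(message)

class AltitudeLogger:
    """Example plugin component (hypothetical name)."""
    def __init__(self):
        self.log = []
    def init(self, core):
        core.subscribe("aircraft/altitude", self.log.append)

core = SimulationCore()
logger = AltitudeLogger()
core.register(logger)
core.publish("aircraft/altitude", 35000)   # logger receives the update
```

Because components only touch the core's interface, each plugin can be developed and shared independently, which is the modularity benefit the report describes.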
Kopton, Isabella M.; Kenning, Peter
Over the last decade, the application of neuroscience to economic research has gained in importance and the number of neuroeconomic studies has grown extensively. The most common method for these investigations is fMRI. However, fMRI has limitations (particularly concerning situational factors) that should be countered with other methods. This review elaborates on the use of functional Near-Infrared Spectroscopy (fNIRS) as a new and promising tool for investigating economic decision making both in field experiments and outside the laboratory. We describe results of studies investigating the reliability of prototype NIRS studies, as well as detailing experiments using conventional and stationary fNIRS devices to analyze this potential. This review article shows that further research using mobile fNIRS for studies on economic decision making outside the laboratory could be a fruitful avenue helping to develop the potential of a new method for field experiments outside the laboratory. PMID:25147517
Convertino, V. A.
Lower body negative pressure (LBNP) has been extensively used for decades in aerospace physiological research as a tool to investigate cardiovascular mechanisms that are associated with or underlie performance in aerospace and military environments. In comparison with clinical stand and tilt tests, LBNP represents a relatively safe methodology for inducing highly reproducible hemodynamic responses during exposure to footward fluid shifts similar to those experienced under orthostatic challenge. By maintaining an orthostatic challenge in a supine posture, removal of leg support (muscle pump) and head motion (vestibular stimuli) during LBNP provides the capability to isolate cardiovascular mechanisms that regulate blood pressure. LBNP can be used for physiological measurements, clinical diagnoses and investigational research comparisons of subject populations and alterations in physiological status. The applications of LBNP to the study of blood pressure regulation in spaceflight, ground-based simulations of low gravity, and hemorrhage have provided unique insights and understanding for development of countermeasures based on physiological mechanisms underlying the operational problems.
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila
The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.
Amon, Krestina L; Campbell, Andrew J; Hawke, Catherine; Steinbeck, Katharine
Researchers are increasingly using social media to recruit participants to surveys and clinical studies. However, the evidence of the efficacy and validity of adolescent recruitment through Facebook is yet to be established. To conduct a systematic review of the literature on the use of Facebook to recruit adolescents for health research. Nine electronic databases and reference lists were searched for articles published between 2004 and 2013. Studies were included in the review if: 1) participants were aged ≥ 10 to ≤ 18 years, 2) studies addressed a physical or mental health issue, 3) Facebook was identified as a recruitment tool, 4) recruitment details using Facebook were outlined in the methods section and considered in the discussion, or information was obtained by contacting the authors, 5) results revealed how many participants were recruited using Facebook, and 6) studies addressed how adolescent consent and/or parental consent was obtained. Titles, abstracts, and keywords were scanned and duplicates removed by 2 reviewers. Full text was evaluated for inclusion criteria, and 2 reviewers independently extracted data. The search resulted in 587 publications, of which 25 full-text papers were analyzed. Six studies met all the criteria for inclusion in the review. Three recruitment methods using Facebook were identified: 1) paid Facebook advertising, 2) use of the Facebook search tool, and 3) creation and use of a Facebook Page. Eligible studies described the use of paid Facebook advertising and Facebook as a search tool as methods to successfully recruit adolescent participants. Online and verbal consent was obtained from participants recruited from Facebook. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Miller, Brian W.; Morisette, Jeffrey T.
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Sacchettini, G; Calliera, M
In the Horizon 2020 work programme 2016-17 it is stated that in 2010, 71% of European farm managers were operating on the basis of practical experience only. Education levels greatly vary depending on country, farm managers' age and gender, or farm structures, and this can hamper innovation. Transition towards a more sustainable agriculture requires a renewal and strengthening of the technical skills of all the actors involved and - as a consequence - of the educational system. The EU Directive on the sustainable use of pesticides (EU, 128/2009/EC) requires European Member States to develop training activities targeting occupational exposure to pesticides. The objective of this study is to develop new training tools for operators, addressing the new legal requirements and taking into account what is already available. For this reason, the outcomes of different European and national research projects developed by the Opera Research Centre were used, involving stakeholders in the decision making process, but also considering the real behaviours and perceptions of the final users. As a result, an e-learning tool able to build personalized training programmes, by collecting and integrating existing training material on Plant Protection Products use, was developed, together with an e-learning course, with the aim of helping operators, advisors and distributors prepare for their national certificate test. This work highlights the opportunity to create long-term added value through enhanced collaboration between educators and researchers, and identifies a common set of priorities that has to be taken into account in order to nudge the changes required to achieve a more sustainable use of pesticides and, more generally, sustainable development. Copyright © 2016 Elsevier B.V. All rights reserved.
Wang, Ximing; Documet, Jorge; Garrison, Kathleen A.; Winstein, Carolee J.; Liu, Brent
Stroke is a major cause of adult disability. The Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (I-CARE) clinical trial aims to evaluate a therapy for arm rehabilitation after stroke. A primary outcome measure is correlative analysis between stroke lesion characteristics and standard measures of rehabilitation progress, from data collected at seven research facilities across the country. Sharing and communication of brain imaging and behavioral data is thus a challenge for collaboration. A solution is proposed as a web-based system with tools supporting imaging and informatics related data. In this system, users may upload anonymized brain images through a secure internet connection and the system will sort the imaging data for storage in a centralized database. Users may utilize an annotation tool to mark up images. In addition to imaging informatics, electronic data forms, for example, clinical data forms, are also integrated. Clinical information is processed and stored in the database to enable future data mining related development. Tele-consultation is facilitated through the development of a thin-client image viewing application. For convenience, the system supports access through desktop PCs, laptops, and iPads. Thus, clinicians may enter data directly into the system via iPad while working with participants in the study. Overall, this comprehensive imaging informatics system enables users to collect, organize and analyze stroke cases efficiently.
There is increasing attention to the centrality of idealization in science. One common view is that models and other idealized representations are important to science, but that they fall short in one or more ways. On this view, there must be an intermediary step between idealized representation and the traditional aims of science, including truth, explanation, and prediction. Here I develop an alternative interpretation of the relationship between idealized representation and the aims of science. I suggest that continuing, widespread idealization calls into question the idea that science aims for truth. If instead science aims to produce understanding, this would enable idealizations to directly contribute to science's epistemic success. I also use the fact of widespread idealization to motivate the idea that science's wide variety of aims, epistemic and non-epistemic, are best served by different kinds of scientific products. Finally, I show how these diverse aims, most rather distant from truth, result in the expanded influence of social values on science. Copyright © 2015 Elsevier Ltd. All rights reserved.
Carrie L. Iwema
The time it takes for a completed manuscript to be published traditionally can be extremely lengthy. Article publication delay, which occurs in part due to constraints associated with peer review, can prevent the timely dissemination of critical and actionable data associated with new information on rare diseases or developing health concerns such as Zika virus. Preprint servers are open access online repositories housing preprint research articles that enable authors (1) to make their research immediately and freely available and (2) to receive commentary and peer review prior to journal submission. There is a growing movement of preprint advocates aiming to change the current journal publication and peer review system, proposing that preprints catalyze biomedical discovery, support career advancement, and improve scientific communication. While the number of articles submitted to and hosted by preprint servers is gradually increasing, there has been no simple way to identify biomedical research published in a preprint format, as preprints are not typically indexed and are only discoverable by directly searching the specific preprint server websites. To address this issue, we created a search engine that quickly compiles preprints from disparate host repositories and provides a one-stop search solution. Additionally, we developed a web application that bolsters the discovery of preprints by enabling each and every word or phrase appearing on any web site to be integrated with articles from preprint servers. This tool, search.bioPreprint, is publicly available at http://www.hsls.pitt.edu/resources/preprint.
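The "compile preprints from disparate host repositories" step amounts to a federated search: fan the query out to one fetcher per repository and merge the results, de-duplicating records seen on more than one server. The sketch below uses canned stand-in fetchers with hypothetical DOIs; the real search.bioPreprint service queries actual preprint repositories.

```python
def federated_search(query, fetchers):
    """Fan a query out to several preprint-server fetchers and merge
    the results, de-duplicating on DOI. A sketch of the pattern only."""
    seen, merged = set(), []
    for fetch in fetchers:
        for record in fetch(query):
            if record["doi"] not in seen:      # keep first copy of each preprint
                seen.add(record["doi"])
                merged.append(record)
    return merged

# hypothetical stand-in fetchers returning canned records
def server_a(query):
    return [{"doi": "10.1101/0001", "title": f"{query} study A"}]

def server_b(query):
    return [{"doi": "10.1101/0001", "title": f"{query} study A"},   # duplicate
            {"doi": "10.1101/0002", "title": f"{query} study B"}]

results = federated_search("zika", [server_a, server_b])
```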
Ogao, Patrick J
Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping--all critical aspects of a country's socio-economic development. In this paper, user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Three types of animation were used, namely: passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with the basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a think-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool in identifying, interpreting and
Bickerstaffe, Adrian; Ranaweera, Thilina; Endersby, Travis; Ellis, Christopher; Maddumarachchi, Sanjaya; Gooden, George E; White, Paul; Moses, Eric K; Hewitt, Alex W; Hopper, John L
The Ark is an open-source web-based tool that allows researchers to manage health and medical research data for humans and animals without specialized database skills or programming expertise. The system provides data management for core research information including demographic, phenotype, biospecimen and pedigree data, in addition to supporting typical investigator requirements such as tracking participant consent and correspondence, whilst also being able to generate custom data exports and reports. The Ark is 'study generic' by design and highly configurable via its web interface, allowing researchers to tailor the system to the specific data management requirements of their study. Source code for The Ark can be obtained freely from the website https://github.com/The-Ark-Informatics/ark/ . The source code can be modified and redistributed under the terms of the GNU GPL v3 license. Documentation and a pre-configured virtual appliance can be found at the website http://sphinx.org.au/the-ark/ . firstname.lastname@example.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: email@example.com
Garaizar, Pablo; Reips, Ulf-Dietrich
Social networking has surpassed e-mail and instant messaging as the dominant form of online communication (Meeker, Devitt, & Wu, 2010). Currently, all large social networks are proprietary, making it difficult to impossible for researchers to make changes to such networks for the purpose of study design and access to user-generated data from the networks. To address this issue, the authors have developed and present Social Lab, an Internet-based free and open-source social network software system available from http://www.sociallab.es . Having full availability of navigation and communication data in Social Lab allows researchers to investigate behavior in social media on an individual and group level. Automated artificial users ("bots") are available to the researcher to simulate and stimulate social networking situations. These bots respond dynamically to situations as they unfold. The bots can easily be configured with scripts and can be used to experimentally manipulate social networking situations in Social Lab. Examples for setting up, configuring, and using Social Lab as a tool for research in social media are provided.
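The "bots" described above respond dynamically to events according to configurable scripts. A minimal way to sketch that is a rule table mapping event predicates to responders; all names below are illustrative, not Social Lab's actual API.

```python
class Bot:
    """Sketch of a scriptable social-network bot in the spirit of
    Social Lab's automated users: each rule pairs an event predicate
    with a responder, and the first matching rule produces the reply."""
    def __init__(self, name, rules):
        self.name = name
        self.rules = rules                 # list of (predicate, responder)

    def on_event(self, event):
        for predicate, responder in self.rules:
            if predicate(event):
                return responder(event)
        return None                        # no rule matched: stay silent

# a bot "script": accept friend requests, reply to greetings on its wall
greeter = Bot("greeter", [
    (lambda e: e["type"] == "friend_request",
     lambda e: {"type": "accept", "to": e["from"]}),
    (lambda e: e["type"] == "wall_post" and "hello" in e["text"].lower(),
     lambda e: {"type": "reply", "to": e["from"], "text": "Hi there!"}),
])

reply = greeter.on_event({"type": "wall_post", "from": "p1", "text": "Hello!"})
```

Swapping rule tables lets an experimenter manipulate how the artificial users behave in different conditions without changing the bot machinery itself.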
James S. Bates
Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesotho language, spoken in the African nation of Lesotho where the concept for this instrument was conceived. Ketso techniques fall into the participatory action research family of social science research methods (Tippett, Handley, & Ravetz, 2007). Ohio State University Extension professionals have used the Ketso toolkit and its techniques in numerous settings, including for professional development, conducting community needs/interests assessments, brainstorming, and data collection. As a toolkit, Ketso uses tactile and colorful leaves, branches, and icons to organize and display participants’ contributions on felt mats. As an array of techniques, Ketso is effective in engaging audiences because it is inclusive and provides each participant a platform for their perspective to be shared.
Veller, van M.G.P.; Gerritsma, W.
Wageningen UR Library has developed a tool, based upon co-citation analysis, that recommends alternative journals for any journal a researcher looks up in the tool. The journal recommendations can be tuned to reflect the citation preferences of each of the five science groups that comprise
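The co-citation idea behind such a recommender can be sketched simply (a generic illustration with made-up journal names, not the Wageningen implementation): two journals are co-cited whenever one paper's reference list cites both, and journals most often co-cited with the looked-up journal are recommended first.

```python
from collections import Counter
from itertools import combinations

def cocitation_recommendations(reference_lists, journal):
    """Recommend journals frequently cited alongside `journal`.
    Each reference list is the set of journals cited by one paper;
    a pair of journals is co-cited once per paper citing both."""
    cocitations = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            cocitations[(a, b)] += 1
    scores = Counter()
    for (a, b), n in cocitations.items():
        if a == journal:
            scores[b] = n
        elif b == journal:
            scores[a] = n
    return [j for j, _ in scores.most_common()]   # strongest co-citation first

# made-up reference lists from four hypothetical papers
papers = [{"J Hydrol", "Water Res", "Ecol Model"},
          {"J Hydrol", "Water Res"},
          {"J Hydrol", "Water Res"},
          {"J Hydrol", "Ecol Model"}]
recs = cocitation_recommendations(papers, "J Hydrol")
```

Tuning per science group, as the abstract describes, would amount to restricting `reference_lists` to papers from that group before counting.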
Smartt, H.; Kuhn, M.; Krementz, D.
The U.S. National Nuclear Security Administration (NNSA) Office of Non-proliferation and Verification Research and Development currently funds research on advanced containment technologies to support Continuity of Knowledge (CoK) objectives for verification regimes. One effort in this area is the Advanced Tools for Maintaining Continuity of Knowledge (ATCK) project. Recognizing that CoK assurances must withstand potential threats from sophisticated adversaries, and that containment options must therefore keep pace with technology advances, the NNSA research and development on advanced containment tools is an important investment. The two ATCK efforts underway at present address the technical containment requirements for securing access points (loop seals) and protecting defined volumes. Multiple U.S. national laboratories are supporting this project: Sandia National Laboratories (SNL), Savannah River National Laboratory (SRNL), and Oak Ridge National Laboratory (ORNL). SNL and SRNL are developing the "Ceramic Seal," an active loop seal that integrates multiple advanced security capabilities and improved efficiency housed within a small-volume ceramic body. The development includes an associated handheld reader and interface software. Currently at the prototype stage, the Ceramic Seal will undergo a series of tests to determine operational readiness. It will be field tested in a representative verification trial in 2016. ORNL is developing the Whole Volume Containment Seal (WCS), a flexible conductive fabric capable of enclosing various sizes and shapes of monitored items. The WCS includes a distributed impedance measurement system for imaging the fabric surface area and passive tamper-indicating features such as permanent-staining conductive ink. With the expected technology advances from the Ceramic Seal and WCS, the ATCK project takes significant steps in advancing containment technologies to help maintain CoK for various verification
Galuvao, Akata Sisigafu'aapulematumua
This article introduces Tofa'a'anolasi, a novel Samoan research framework created by drawing on the work of other Samoan and Pacific education researchers, in combination with adapting the 'Foucauldian tool box' to use for research carried out from a Samoan perspective. The article starts with an account and explanation of the process of…
Laursen, S. L.; Hunter, A.; Weston, T.; Thiry, H.
Evidence-based thinking is essential both to science and to the development of effective educational programs. Thus assessment of student learning—gathering evidence about the nature and depth of students’ learning gains, and about how they arise—is a centerpiece of any effective undergraduate research (UR) program. Assessment data can be used to monitor progress, to diagnose problems, to strengthen program designs, and to report both good outcomes and strategies to improve them to institutional and financial stakeholders in UR programs. While the positive impact of UR on students’ educational, personal and professional development has long been a matter of faith, only recently have researchers and evaluators developed an empirical basis by which to identify and explain these outcomes. Based on this growing body of evidence, URSSA, the Undergraduate Research Student Self-Assessment, is a survey tool that departments and programs can use to assess student outcomes of UR. URSSA focuses on what students learn from their UR experience, rather than whether they liked it. Both multiple-choice and open-ended items focus on students’ gains from UR, including: (1) skills such as lab work and communication; (2) conceptual knowledge and linkages among ideas in their field and with other fields; (3) deepened understanding of the intellectual and practical work of science; (4) growth in confidence and adoption of the identity of scientist; (5) preparation for a career or graduate school in science; and (6) greater clarity in understanding what career or educational path they might wish to pursue. Other items probe students’ participation in important activities that have been shown to lead to these gains; and a set of optional items can be included to probe specific program features that may supplement UR (e.g. field trips, career seminars, housing arrangements). The poster will describe URSSA's content, development, validation, and use. For more information about
De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava
The world has witnessed growing complexities in the disease scenario, influenced by drastic changes in the host-pathogen-environment triadic relation. Pharmaceutical R&D is in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalency studies. Apart from the cost, time and ethical issues of animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards and biodiversity. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, finally complicating the total scenario. Thus pharma R&D faces the challenge of developing bio-waiver protocols. Lead optimization, selection of drug candidates with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools like Gastro Plus™, PK Sim® and SimCyp find application for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development toolbox. PBPK modelling provides a state-of-the-art alternative to animal experimentation. PBPK models can successfully bypass bio-equivalency studies and predict bioavailability and drug interactions, and, when coupled with in vitro-in vivo correlation, can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly, predictive bio-waiver tool in drug development. Copyright© Bentham Science Publishers; For any queries, please email at firstname.lastname@example.org.
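As a minimal illustration of the kind of simulation underlying the PBPK tools named above (the actual Gastro Plus™ and PK Sim® models are far richer, whole-body compartmental models), a classic one-compartment oral-absorption model can be sketched as follows; all parameter values here are hypothetical and for illustration only:

```python
import math

def concentration(t, dose=500.0, F=0.8, V=42.0, ka=1.2, ke=0.23):
    """Plasma concentration (mg/L) at time t (h) for a one-compartment
    oral model with first-order absorption (ka) and elimination (ke).
    F is bioavailability, V the volume of distribution (L)."""
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def t_max(ka=1.2, ke=0.23):
    """Time of peak concentration, obtained analytically from dC/dt = 0."""
    return math.log(ka / ke) / (ka - ke)
```

Quantities such as Cmax and AUC derived from curves like this are the basis for the bioavailability and bioequivalence predictions the abstract describes.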
Chen, Chunpeng James; Zhang, Zhiwu
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes, so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), the primary method for identifying genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves, rigid data-format requirements, and zero tolerance for mistakes in keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third-party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are conducted automatically for the specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. email@example.com.
Blackledge, Matthew D; Collins, David J; Koh, Dow-Mu; Leach, Martin O
We present pyOsiriX, a plugin built for the already popular DICOM viewer OsiriX that gives users the ability to extend the functionality of OsiriX through simple Python scripts. This approach allows users to integrate the many cutting-edge scientific and image-processing libraries created for Python into a powerful DICOM visualisation package that is intuitive to use and already familiar to many clinical researchers. Using pyOsiriX we hope to bridge the apparent gap between basic imaging scientists and clinical practice in a research setting and thus accelerate the development of advanced clinical image processing. We provide arguments for the use of Python as a robust scripting language for incorporation into larger software solutions, outline the structure of pyOsiriX and how it may be used to extend the functionality of OsiriX, and provide three case studies that exemplify its utility. For our first case study we use pyOsiriX to provide a tool for smooth histogram display of voxel values within a user-defined region of interest (ROI) in OsiriX. We used a kernel density estimation (KDE) method available in Python through the scikit-learn library, where the total number of lines of Python code required to generate this tool was 22. Our second example presents a scheme for segmentation of the skeleton from CT datasets. We have demonstrated that good segmentation can be achieved for two example CT studies by using a combination of Python libraries including scikit-learn, scikit-image, SimpleITK and matplotlib. Furthermore, this segmentation method was incorporated into an automatic analysis of quantitative PET-CT in a patient with bone metastases from primary prostate cancer. This enabled repeatable statistical evaluation of PET uptake values for each lesion, before and after treatment, providing estimates of the maximum and median standardised uptake values (SUVmax and SUVmed respectively). Following treatment we observed a reduction in lesion volume, SUVmax and SUVmed for
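The abstract does not reproduce the 22-line pyOsiriX script, but the smoothed-histogram idea behind the first case study can be sketched outside OsiriX in a few lines. This sketch uses SciPy's Gaussian KDE rather than the scikit-learn estimator the authors used, and the voxel values are synthetic stand-ins for a real ROI:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical voxel values from a region of interest (ROI):
# a bimodal mixture standing in for, e.g., soft tissue plus lesion.
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(40, 5, 500), rng.normal(70, 8, 200)])

# Gaussian KDE yields a smooth density curve in place of a binned histogram.
kde = gaussian_kde(voxels)
grid = np.linspace(voxels.min(), voxels.max(), 256)
density = kde(grid)
```

Plotting `density` against `grid` (e.g. with matplotlib) gives the kind of smooth ROI histogram described in the case study.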
Satpathy, R; Konkimalla, V B; Ratha, J
Microbial dehalogenation is a biochemical process in which halogenated substances are converted enzymatically into their non-halogenated forms. Microorganisms possess a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds, being pollutants, need to be remediated; therefore, current approaches explore the potential of microbes at a molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Thus, once a microorganism is discovered, one can predict its genes and proteins, perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights bioinformatics approaches, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modelling methods comprising gene finding, protein modelling, Quantitative Structure Biodegradability Relationship (QSBR) studies and reconstruction of the metabolic pathways employed in dehalogenation research.
Saadet Kuru Cetin
In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools, using alternative observation tools within the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years, and the class observations were made with six volunteer teachers in primary and secondary schools in provincial and district centers. In the classroom observations, the teacher's verbal flow scheme, the teacher's movement scheme, and student behaviors both on- and off-task were analyzed. Observations were made during two classes with the teacher's permission. After the first observation, an information meeting was held, and then the second observation was made. Following the observations, interviews were held with the teachers, in which the information from the class observations was shared with them and their opinions about the research were sought. It was found that alternative observations, in general, have a positive effect on the professional development of teachers. It is concluded that this type of observation approach positively affects teachers' in-class activities, helps in classroom management and teaching arrangements, and helps reduce unwanted student behaviors.
This paper attempts to consider the aims that undergraduate physics degree courses actually reflect and serve, in the light of the employment patterns of graduates and of the expressed needs of employers. Calling on evidence mainly from the UK, it reviews analyses of what degree examinations actually test, and goes on to quote criticisms of existing courses and the radical proposals to change them adopted by senior physics professors in the UK. The discussion is then broadened by evidence about the employment of graduates and about the priorities that some industrialists now give to the qualities they look for when recruiting new graduates. The evidence leads to the view that radical changes are needed, both in courses and in examinations, and that university departments need to work more closely with employers in re-formulating the aims and priorities of their teaching.
Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T
The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with three simple tools: paper, pen, and a calculator. A computer program can quickly calculate the t-test for large samples. The ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
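The two-sample calculation the article describes can indeed be done with nothing more than the defining formula; a minimal sketch with made-up sample data, using the pooled-variance form that assumes equal population variances (as in the classic test):

```python
import math
import statistics

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    # Pooled variance weights each sample variance by its degrees of freedom.
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se

# Illustrative (hypothetical) samples; degrees of freedom = 3 + 3 - 2 = 4.
t = students_t([1, 2, 3], [2, 4, 6])
```

The resulting t would then be compared against the t distribution with the appropriate degrees of freedom to draw the inference; it is precisely the ease of this step that, as the article notes, invites misuse when the test's assumptions go unchecked.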
Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio
Here we present an overview of some published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to functional Magnetic Resonance Imaging (fMRI), which is also largely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while people are watching marketing-relevant stimuli. Such information relates to the memorization of, and pleasantness attributed to, those stimuli. We noted that temporal and frequency patterns of brain signals are able to provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how an EEG methodology could be used to analyze cultural differences in the viewing of video commercials for carbonated beverages in Western and Eastern countries.
Here we provide an update on construction of the five NEON Mobile Deployment Platforms (MDPs) as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e. timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g. fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measurements of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux, and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.
Supreet Kaur Gill
Clinical research makes tireless efforts to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture and the electronic case report form (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.
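As one concrete example of the randomization support mentioned above, permuted-block randomization (a common trial-design scheme; the specific commercial packages named in the review are not reproduced here) keeps group sizes balanced throughout patient accrual. A minimal sketch:

```python
import random

def block_randomize(n_blocks, arms=("drug", "placebo"), block_size=4, seed=42):
    """Permuted-block randomization: every block contains each arm equally
    often, so arm sizes never drift far apart as patients accrue."""
    rng = random.Random(seed)
    # Build one balanced block template, e.g. ["drug", "drug", "placebo", "placebo"].
    per_block = [arm for arm in arms for _ in range(block_size // len(arms))]
    schedule = []
    for _ in range(n_blocks):
        block = per_block[:]
        rng.shuffle(block)   # random order within the block
        schedule.extend(block)
    return schedule

assignments = block_randomize(n_blocks=5)
```

A fixed seed is used here only so the sketch is reproducible; real trial software would conceal the sequence from investigators to preserve allocation concealment.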
Goldfarb, L.; Yang, A.
Leah Goldfarb, Paul Cutler, Andrew Yang*, Mustapha Mokrane, Jacinta Legg and Deliang Chen The scientific community has been engaged in developing an international strategy on Earth system research. The initial consultation in this “visioning” process focused on gathering suggestions for Earth system research priorities that are interdisciplinary and address the most pressing societal issues. This was implemented through a website that utilized Web 2.0 capabilities. The website (http://www.icsu-visioning.org/) collected input from 15 July to 1 September 2009. This consultation was the first in which the international scientific community was asked to help shape the future of a research theme. The site attracted over 7000 visitors from 133 countries, more than 1000 of whom registered and took advantage of the site’s functionality to contribute research questions (~300 questions), comment on posts, and/or vote on questions. To facilitate analysis of the results, the site captured a small set of voluntary information about each contributor and their contribution. A group of ~50 international experts was invited to analyze the inputs at a “Visioning Earth System Research” meeting held in September 2009. The outcome of this meeting—a prioritized list of research questions to be investigated over the next decade—was then posted on the visioning website for additional comment from the community through an online survey tool. Many lessons were learned in the development and implementation of this website, both in terms of the opportunities offered by Web 2.0 capabilities and the application of these capabilities. It is hoped that this process may serve as a model for other scientific communities. The International Council for Science (ICSU), in cooperation with the International Social Science Council (ISSC), is responsible for organizing this Earth system visioning process.
Binello, E.; Mitchell, R.N.; Harling, O.K.
An immunologic tool based on manipulation of the boron neutron capture reaction was previously proposed in the context of heart transplantation research to examine the temporal relationship between parenchymal rejection (representing immune cell infiltration) and transplantation-associated arteriosclerosis (characterized by progressive vascular occlusion). Critical to the development of this method is the uptake of boron by specific cells of the immune system, namely T cells, without adverse effects on cell function, which may be assessed by the ability of boron-loaded cells to produce IFNγ, a protein with substantial impact on rejection. This work presents the evaluation of two carboranyl thymidine analogs. Advantages of this type of boron compound are a reduced risk of leakage and effective dose delivery based on their incorporation into cellular nuclear material. Results indicate that uptake of these boronated nucleosides is high with no adverse effects on cell function, thereby warranting the continued development of this technique, which has potentially wide applicability in immunological models.
The report includes the following chapters: (1) Introduction: ozone in the atmosphere, anthropogenic influence on the ozone layer, polar stratospheric ozone loss; (2) Tracer-tracer relations in the stratosphere: tracer-tracer relations as a tool in atmospheric research; impact of cosmic-ray-induced heterogeneous chemistry on polar ozone; (3) Quantifying polar ozone loss from ozone-tracer relations: principles of tracer-tracer correlation techniques; reference ozone-tracer relations in the early polar vortex; impact of mixing on ozone-tracer relations in the polar vortex; impact of mesospheric intrusions on ozone-tracer relations in the stratospheric polar vortex; calculation of chemical ozone loss in the Arctic in March 2003 based on ILAS-II measurements; (4) Epilogue.
Nelson, Douglas G; Byus, Kent
Contemporary public health requires the support and participation of its constituency. This study assesses the capacity of consumption value theory to identify the basis of this support. A telephone survey design used simple random sampling of adult residents of Cherokee County, Oklahoma. Factor analysis and stepwise discriminant analysis were used to identify and classify personal- and societal-level support variables. Most residents base societal-level support on epistemic values. Direct-services clientele base their support on positive emotional values derived from personal contact and attractive programs. Residents are curious about public health and want to know more about the health department. Whereas marketing the effectiveness of public health programs would yield relatively little support, marketing health promotion activities may attract public opposition. This formative research tool suggests a marketing strategy for public health practitioners.
Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
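The abstract does not reproduce SMART's hidden Markov machinery, but the core decoding step that such tools rely on can be sketched with a minimal Viterbi implementation for a two-state, discrete-emission model. All probabilities below are invented for illustration; SMART's general algorithm fits user-specified models rather than this fixed one:

```python
def viterbi(obs, pi, A, B):
    """Most-likely hidden state path for a discrete observation sequence.
    pi: initial state probabilities, A: transition matrix, B: emission matrix."""
    n_states = len(pi)
    v = [pi[s] * B[s][obs[0]] for s in range(n_states)]  # path probabilities
    back = []                                            # backpointers per step
    for o in obs[1:]:
        ptr, nv = [], []
        for s in range(n_states):
            prev = max(range(n_states), key=lambda r: v[r] * A[r][s])
            ptr.append(prev)
            nv.append(v[prev] * A[prev][s] * B[s][o])
        back.append(ptr)
        v = nv
    state = max(range(n_states), key=lambda s: v[s])
    path = [state]
    for ptr in reversed(back):   # trace the backpointers to recover the path
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Two "sticky" states, one emitting mostly 0s and one mostly 1s,
# standing in for, e.g., low- and high-signal states of a single molecule.
path = viterbi([0, 0, 1, 1, 1, 0],
               pi=[0.5, 0.5],
               A=[[0.8, 0.2], [0.2, 0.8]],
               B=[[0.9, 0.1], [0.1, 0.9]])
```

In a real single-molecule analysis the model parameters would themselves be fit to the data (e.g. by Baum-Welch), with Viterbi decoding then segmenting each trace into dwell states.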
Verma, A.K.; Varde, P.V.; Sankar, S.; Prakash, P.
A prototype Knowledge-Based (KB) Operator Adviser (OPAD) system has been developed for a 100 MW(th) heavy-water-moderated and -cooled, natural-uranium-fuelled research reactor. The development objective of this system is to improve the reliability of operator actions, and hence reactor safety, during crises as well as normal operation. The jobs performed by this system include alarm analysis, transient identification, reactor safety status monitoring, qualitative fault diagnosis and procedure generation in reactor operation. In order to address safety objectives at various stages of the Operator Adviser (OPAD) system development, the knowledge has been structured using PSA tools and information in a shell environment. To demonstrate the feasibility of combining the KB approach with PSA for an operator adviser system, salient features of some of the important modules (viz. FUELEX, LOOPEX and LOCAEX) are discussed. It has been found that this system can serve as an efficient operator support system.
Gholami, Jaleh; Majdzadeh, Reza; Nedjat, Saharnaz; Nedjat, Sima; Maleki, Katayoun; Ashoorkhani, Mahnaz; Yazdizadeh, Bahareh
The knowledge translation self-assessment tool for research institutes (SATORI) was designed to assess the status of knowledge translation in research institutes. The objective was to identify the weaknesses and strengths of knowledge translation in research centres and faculties associated with Tehran University of Medical Sciences (TUMS). The tool, consisting of 50 statements in four main domains, was used in 20 TUMS-affiliated research centres and departments after its reliability was established. It was completed in a group discussion by the members of the research council, researchers and research users' representatives from each centre and/or department. The mean scores obtained in the four domains of 'The question of research', 'Knowledge production', 'Knowledge transfer' and 'Promoting the use of evidence' were 2.26, 2.92, 2 and 1.89 (out of 5), respectively. Nine out of 12 interventional priorities with the lowest quartile score were related to knowledge transfer resources and strategies, whereas eight of them were in the highest quartile and related to 'The question of research' and 'Knowledge production'. The self-assessment tool identifies the gaps in capacity and infrastructure of knowledge translation support within research organizations. Assessment of research institutes using SATORI pointed out that strengthening knowledge translation through the provision of financial support for knowledge translation activities, creating supportive and facilitating infrastructures, and facilitating interactions between researchers and target audiences to exchange questions and research findings are among the priorities of research centres and/or departments.
Cassandras, Christos G; Gong, Weibo; Pepyne, David L; Lee, Wenke; Liu, Hong; Ho, Yu-Chi; Pfeffer, Avrom
The specific aim of this research is to develop theories, methodologies, tools, and implementable solutions for modeling, analyzing, designing, and securing information networks against information-based attack...
Torres, Samantha; de la Riva, Erika E; Tom, Laura S; Clayman, Marla L; Taylor, Chirisse; Dong, Xinqi; Simon, Melissa A
Despite increasing need to boost the recruitment of underrepresented populations into cancer trials and biobanking research, few tools exist for facilitating dialogue between researchers and potential research participants during the recruitment process. In this paper, we describe the initial processes of a user-centered design cycle to develop a standardized research communication tool prototype for enhancing research literacy among individuals from underrepresented populations considering enrollment in cancer research and biobanking studies. We present qualitative feedback and recommendations on the prototype's design and content from potential end users: five clinical trial recruiters and ten potential research participants recruited from an academic medical center. Participants were given the prototype (a set of laminated cards) and were asked to provide feedback about the tool's content, design elements, and word choices during semi-structured, in-person interviews. Results suggest that the prototype was well received by recruiters and patients alike. They favored the simplicity, lay language, and layout of the cards. They also noted areas for improvement, leading to card refinements that included the following: addressing additional topic areas, clarifying research processes, increasing the number of diverse images, and using alternative word choices. Our process for refining user interfaces and iterating content in early phases of design may inform future efforts to develop tools for use in clinical research or biobanking studies to increase research literacy.
Riggs, E. M.
beginners. Thus researchers must embrace the uncontrolled nature of the setting, the qualitative nature of the data collected, and the researcher's role in interpreting geologically appropriate actions as evidence of successful problem solving and investigation. Working to understand the role of diversity and culture in the geosciences also involves a wide array of theory, from affective issues through culturally and linguistically influenced cognition, through gender, self-efficacy, and many other areas of inquiry. Research in understanding spatial skills draws heavily on techniques from cognition research but must also involve the field-specific knowledge of geoscientists to infuse these techniques with exemplars, a catalog of meaningful actions by students, and an understanding of how to recognize success. These examples briefly illustrate the wide array of tools from other fields that are being brought to bear to advance rigorous geoscience education research. We will illustrate a few of these and the insights we have gained, and the power of theory and method from other fields to enlighten us as we attempt to educate a broader array of earth scientists.
Hoffmeyer, Mikkeline; Jensen, Jesper Juellund; Olsen, Marie Veisegaard
Digital multimodal production is becoming increasingly important as a 21st century skill and as a learning condition in school (K-12). Moreover, there is growing attention to the significance of criteria-based assessment for learning. Nevertheless, assessment of students’ digital multimodal productions is often vague or lacking. Therefore, the research project aims at developing a tool to support assessment of students’ digital multimodal productions through a design-based research method. This paper presents a proposal for issues to be considered through a prototyping phase, based on interviews...
Digital tool making offers many challenges, involving much trial and error. Developing machine learning and assistance in automated and semi-automated Internet resource discovery, metadata generation, and rich-text identification provides opportunities for great discovery, innovation, and the potential for transformation of the library community. The areas of computer science involved, as applied to the library applications addressed, are among that discipline’s leading edges. Making applied research practical and applicable, through placement within library/collection-management systems and services, involves equal parts computer scientist, research librarian, and legacy-systems archaeologist. Still, the early harvest is there for us now, with a large harvest pending. Data Fountains and iVia, the projects discussed, demonstrate this. Clearly, then, the present would be a good time for the library community to more proactively and significantly engage with this technology and research, to better plan for its impacts, to more proactively take up the challenges involved in its exploration, and to better and more comprehensively guide effort in this new territory. The alternative to doing this is that others will develop this territory for us, do it not as well, and sell it back to us at a premium. Awareness of this technology and its current capabilities, promises, limitations, and probable major impacts needs to be generalized throughout the library management, metadata, and systems communities. This article charts recent work, promising avenues for new research and development, and issues the library community needs to understand.
Holmes, Bruce J.; Sawhill, Bruce K.; Herriot, James; Seehart, Ken; Zellweger, Dres; Shay, Rick
The objective of this research by NextGen AeroSciences, LLC is twofold: 1) to deliver an initial "toolbox" of algorithms, agent-based structures, and method descriptions for introducing trajectory agency as a methodology for simulating and analyzing airspace states, including bulk properties of large numbers of heterogeneous 4D aircraft trajectories in a test airspace -- while maintaining or increasing system safety; and 2) to use these tools in a test airspace to identify possible phase transition structure to predict when an airspace will approach the limits of its capacity. These 4D trajectories continuously replan their paths in the presence of noise and uncertainty while optimizing performance measures and performing conflict detection and resolution. In this approach, trajectories are represented as extended objects endowed with pseudopotential, maintaining time and fuel-efficient paths by bending just enough to accommodate separation while remaining inside of performance envelopes. This trajectory-centric approach differs from previous aircraft-centric distributed approaches to deconfliction. The results of this project are the following: 1) we delivered a toolbox of algorithms, agent-based structures and method descriptions as pseudocode; and 2) we corroborated the existence of phase transition structure in simulation with the addition of "early warning" detected prior to "full" airspace. This research suggests that airspace "fullness" can be anticipated and remedied before the airspace becomes unsafe.
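A single building block of the conflict detection described above, the closest point of approach (CPA) between two straight-line trajectory segments, can be sketched as follows. The actual toolbox uses agent-based, pseudopotential trajectory objects that are far more elaborate; the 2D kinematics, positions and velocities below are invented for illustration:

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two aircraft flying
    straight lines: position p + velocity v * t (2D, constant speed)."""
    px, py = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    speed2 = vx * vx + vy * vy
    # Minimize |p_rel + v_rel * t|; clamp to t >= 0 (no conflicts in the past).
    t = 0.0 if speed2 == 0 else max(0.0, -(px * vx + py * vy) / speed2)
    dx, dy = px + vx * t, py + vy * t
    return t, math.hypot(dx, dy)

# Hypothetical crossing trajectories (units arbitrary, e.g. nmi and minutes).
t_cpa, d_min = closest_approach((0, 0), (1, 0), (10, 5), (-1, -0.5))
```

Comparing `d_min` against a required separation minimum is the detection half of conflict detection and resolution; the resolution half, bending trajectories "just enough" while staying inside performance envelopes, is where the pseudopotential formulation comes in.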
Wilson, G.E.; Boyack, B.E.
Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant's behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab
The purpose of this paper is to showcase the information literacy course for doctoral students called Information Resources and Tools for Research. Turku University Library organises this course in collaboration with the University of Turku Graduate School. The course, which was started in 2012, has been organised four times so far, twice in English and twice in Finnish. It offers training to doctoral candidates of the University of Turku across all doctoral programmes in the University's seven disciplines. In our presentation we will describe the structure and contents of the course and share our experiences of the collaboration with the University of Turku Graduate School. In addition, we will describe how the information specialists of the Turku University Library have collaborated during the course. We will also discuss the challenges of the course. Based on the course feedback, it can be stated that, in general, participants have found this course very useful for their research at the University of Turku.
Rutgers Cooperative Extension developed an online self-assessment tool called the Personal Health and Finance Quiz, available at http://njaes.rutgers.edu/money/health-finance-quiz/. Believed to be among the first public surveys to simultaneously query users about their health and personal finance practices, the quiz is part of Small Steps to Health and Wealth™ (SSHW), a Cooperative Extension program developed to motivate Americans to take action to improve both their health and personal finances (see http://njaes.rutgers.edu/sshw/). Respondents indicate one of four frequencies for the performance of 20 daily activities and receive a Health, a Finance, and a Total score indicating how often they perform activities that health and financial experts recommend. In addition to providing users with personalized feedback, the quiz collects data for research about the health and financial practices of Americans to inform future Extension outreach, and it can be used as a pre-/post-test to evaluate the impact of SSHW programs. Initial research analyses are planned for 2015.
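The scoring scheme described above, four frequency levels across 20 daily activities split into Health and Finance subscores, can be sketched as follows (an illustrative sketch only; the quiz's actual item wording, frequency labels, and point values are not specified in the abstract):

```python
# Hypothetical scorer for a 20-item health/finance quiz. Assumes the first
# 10 items are health practices and the last 10 are finance practices, each
# answered with a frequency scored 1 (never) to 4 (always); the real quiz
# may weight items differently.

FREQUENCIES = {"never": 1, "sometimes": 2, "usually": 3, "always": 4}

def score_quiz(answers):
    """answers: list of 20 frequency strings; first 10 health, last 10 finance."""
    if len(answers) != 20:
        raise ValueError("expected 20 answers")
    points = [FREQUENCIES[a] for a in answers]
    health = sum(points[:10])
    finance = sum(points[10:])
    return {"health": health, "finance": finance, "total": health + finance}
```

A respondent who answers "always" to every health item and "never" to every finance item would, under these assumed point values, score 40/10/50.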
Roysri, Krisana; Chotipanich, Chanisa; Laopaiboon, Vallop; Khiewyoo, Jiraporn
Diagnostic nuclear medicine is being increasingly employed in clinical practice with the advent of new technologies and radiopharmaceuticals. Reporting the prevalence of the studied disease is important for assessing the quality of an article. Therefore, this study was performed to evaluate the quality of published nuclear medicine articles and determine the frequency of reporting the prevalence of studied diseases. We used the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) checklists to evaluate the quality of articles published in the five nuclear medicine journals with the highest impact factors in 2012. The articles were retrieved from the Scopus database and were selected and assessed independently by two nuclear medicine physicians. Decisions concerning equivocal data were made by consensus between the reviewers. The average STARD score was approximately 17 points; the highest score, 17.19±2.38, was obtained by the European Journal of Nuclear Medicine. The QUADAS-2 tool showed that all journals had low bias regarding the study population. The Journal of Nuclear Medicine had the highest score in terms of index test, reference standard, and time interval. Lack of clarity regarding the index test, reference standard, and time interval was frequently observed in all journals, including Clinical Nuclear Medicine, in which 64% of the studies were unclear regarding the index test. The Journal of Nuclear Cardiology had the highest proportion of articles with an appropriate reference standard (83.3%), though it had the lowest frequency of reporting disease prevalence (zero reports). All five journals had similar STARD scores, while the index test, reference standard, and time interval were frequently unclear according to the QUADAS-2 tool. Unfortunately, data were too limited to determine which journal had the lowest risk of bias. In fact, it is the author's responsibility to provide details of research methodology so that the
van Vught, Frans; Westerheijden, Don F.
This paper sets out to analyse the need for better "transparency tools" which inform university stakeholders about the quality of universities. First, we give an overview of what we understand by the concept of transparency tools and those that are currently available. We then critique current transparency tools' methodologies, looking in detail…
Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom
This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…
Nazem, Amir; Mansoori, G Ali
A century of research has passed since the discovery and definition of Alzheimer's disease (AD), the most common dementing disorder worldwide. However, AD still lacks definite diagnostic approaches and an effective cure. Moreover, the currently available diagnostic tools are not sufficient for early screening of AD that would allow preventive approaches to begin. Recently, the emerging field of nanotechnology has promised new techniques to solve some of the AD challenges. Nanotechnology refers to the techniques of designing and manufacturing nanosize (1-100 nm) structures through controlled positional and/or self-assembly of atoms and molecules. In this report, we present the promise that nanotechnology brings to research on AD diagnosis and therapy, including its potential for better understanding of the molecular mechanisms at the root of AD, early diagnosis of AD, and effective treatment. The advances in AD research offered by atomic force microscopy, single-molecule fluorescence microscopy, and NanoSIMS microscopy are examined here. In addition, the recently proposed applications of nanotechnology to the early diagnosis of AD, including the bio-barcode assay, localized surface plasmon resonance nanosensors, quantum dots, and nanomechanical cantilever arrays, are analyzed. Applications of nanotechnology in AD therapy, including neuroprotection against oxidative stress, anti-amyloid therapeutics, neuroregeneration, and drug delivery across the blood-brain barrier (BBB), are discussed and analyzed. All of these applications could improve the treatment of AD and other neurodegenerative diseases. A complete cure for AD may become feasible through a combination of nanotechnology and other novel approaches, such as stem cell technology.
Khayat, K; Salter, B
Recent policy developments, embracing the notions of consumer choice, quality of care, and increased general practitioner control over practice budgets have resulted in a new competitive environment in primary care. General practitioners must now be more aware of how their patients feel about the services they receive, and patient satisfaction surveys can be an effective tool for general practices. A survey was undertaken to investigate the use of a patient satisfaction survey and whether aspects of patient satisfaction varied according to sociodemographic characteristics such as age, sex, social class, housing tenure and length of time in education. A sample of 2173 adults living in Medway District Health Authority were surveyed by postal questionnaire in September 1991 in order to elicit their views on general practice services. Levels of satisfaction varied with age, with younger people being consistently less satisfied with general practice services than older people. Women, those in social classes 1-3N, home owners and those who left school aged 17 years or older were more critical of primary care services than men, those in social classes 3M-5, tenants and those who left school before the age of 17 years. Surveys and analyses of this kind, if conducted for a single practice, can form the basis of a marketing strategy aimed at optimizing list size, list composition, and service quality. Satisfaction surveys can be readily incorporated into medical audit and financial management.
Amin, Waqas; Kang, Hyunseok P; Egloff, Ann Marie; Singh, Harpreet; Trent, Kerry; Ridge-Hetrick, Jennifer; Seethala, Raja R; Grandis, Jennifer; Parwani, Anil V
The Specialized Program of Research Excellence (SPORE) in Head and Neck Cancer neoplasm virtual biorepository is a bioinformatics-supported system that incorporates data from various clinical, pathological, and molecular systems into a single architecture based on a set of common data elements (CDEs) that provides semantic and syntactic interoperability of data sets. The components of this annotation tool include the CDEs themselves, derived from College of American Pathologists (CAP) checklists and North American Association of Central Cancer Registries (NAACCR) standards. The Data Entry Tool is a portable and flexible Oracle-based data entry device, an easily mastered web-based tool. The Data Query Tool helps investigators and researchers search de-identified information within the warehouse/resource through a 'point and click' interface, enabling only the selected data elements to be copied into a data mart using a multidimensional model from the warehouse's relational structure. The SPORE Head and Neck Neoplasm Database contains multimodal datasets that are accessible to investigators via an easy-to-use query tool. The database currently holds 6,553 cases and 10,607 tumor accessions. Among these are 965 metastatic, 4,227 primary, 1,369 recurrent, and 483 new primary cases. Data disclosure is strictly regulated by user authorization. The SPORE Head and Neck Neoplasm Virtual Biorepository is a robust translational biomedical informatics tool that can facilitate basic science, clinical, and translational research. The Data Query Tool acts as a central source, providing a mechanism for researchers to efficiently find clinically annotated datasets and biospecimens relevant to their research areas. The tool protects patient privacy by revealing only de-identified data in accordance with regulations and the approvals of the IRB and scientific review committee.
PET imaging has for many years been a versatile tool for non-invasive imaging of neurophysiology and, indeed, whole-body physiology. Quantitative PET imaging of trace amounts of radioactivity is scientifically elegant and can be very complex. This lecture focuses on whether and where this test is clinically useful. Because of the research tradition, PET imaging has been perceived as an 'expensive' test, as it costs more per scan than CT and MRI scans at most institutions. Such a superficial analysis is incorrect, however, as it is increasingly recognized that imaging costs, which in some circumstances will be increased by the use of PET, are only a relatively small component of patient care costs. Thus, PET may raise imaging costs and the number of imaging procedures in some settings, though PET may reduce imaging test numbers in other settings. However, the analysis must focus on the total costs of patient management. Analyses focused on total patient care costs, including the cost of hospitalization and the cost of surgery as well as imaging costs, have shown that PET can substantially reduce total patient care costs in several settings. This is achieved by providing a more accurate diagnosis, and thus having fewer instances of an incorrect diagnosis resulting in subsequent inappropriate surgery or investigations. Several institutions have shown scenarios in which PET for tumor imaging is cost effective. While the specific results of the analyses vary based on disease prevalence and cost input values for each procedure, as well as the projected performance of PET, the similar results showing total care cost savings in the management of several common cancers strongly support the rationale for the use of PET in cancer management. In addition, promising clinical results are forthcoming in several other illnesses, suggesting PET will have broader utility than these uses alone. Thus, while PET is an 'expensive' imaging procedure and has considerable utility as a research
Bioluminescent imaging (BLI) is a non-invasive imaging modality widely used in the field of pre-clinical oncology research. Imaging of small animal tumour models using BLI involves the generation of light by luciferase-expressing cells in the animal following administration of substrate. This light may be imaged using an external detector. The technique allows a variety of tumour-associated properties to be visualized dynamically in living models. The increasing use of BLI as a small-animal imaging modality has led to advances in the development of xenogeneic, orthotopic, and genetically engineered animal models expressing luciferase genes. This review aims to provide insight into the principles of BLI and its applications in cancer research. Many studies to assess tumour growth and development, as well as efficacy of candidate therapeutics, have been performed using BLI. More recently, advances have also been made using bioluminescent imaging in studies of protein-protein interactions, genetic screening, cell-cycle regulators, and spontaneous cancer development. Such novel studies highlight the versatility and potential of bioluminescent imaging in future oncological research.
Kragelund, Signe H; Kjærsgaard, Mona; Jensen-Fangel, Søren; Leth, Rita A; Ank, Nina
The aim of this study was to develop an audit tool with a built-in database using Research Electronic Data Capture (REDCap®) as part of an antimicrobial stewardship program at a regional hospital in the Central Denmark Region, and to analyse the need, if any, to involve more than one expert in the evaluation of cases of antimicrobial treatment, and the level of agreement among the experts. Patients treated with systemic antimicrobials in the period from 1 September 2015 to 31 August 2016 were included, in total 722 cases. Data were collected retrospectively and entered manually. The audit was based on seven flow charts regarding: (1) initiation of antimicrobial treatment; (2) infection; (3) prescription and administration of antimicrobials; (4) discontinuation of antimicrobials; (5) reassessment within 48 h after the first prescription of antimicrobials; (6) microbiological sampling in the period between suspicion of infection and the first administration of antimicrobials; (7) microbiological results. The audit was based on automatic calculations drawing on the entered data and on expert assessments. Initially, two experts completed the audit, and in the cases in which they disagreed, a third expert was consulted. In 31.9% of the cases, the two experts agreed on all elements of the audit. In 66.2%, the two experts reached agreement by discussing the cases. Finally, 1.9% of the cases were completed in cooperation with a third expert. The experts assessed 3406 flow charts of which they agreed on 75.8%. We succeeded in creating an audit tool with a built-in database that facilitates independent expert evaluation using REDCap. We found a large inter-observer difference that needs to be considered when constructing a project based on expert judgements. Our two experts agreed on most of the flow charts after discussion, whereas the third expert's intervention did not have any influence on the overall assessment. Copyright © 2018 Elsevier Inc. All rights reserved.
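The agreement figures reported above (for example, agreement on 75.8% of 3406 flow charts) correspond to a simple observed-agreement measure between the two experts, which might be computed like this (an illustrative sketch; the actual REDCap export fields and assessment categories are not specified in the abstract):

```python
# Observed agreement: the fraction of items on which two raters gave the
# same categorical assessment. Field names and categories are illustrative.

def observed_agreement(ratings_a, ratings_b):
    """ratings_a, ratings_b: parallel lists of categorical assessments."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("rating lists must be the same length")
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return matches / len(ratings_a)
```

Note that observed agreement does not correct for chance; a study of inter-observer difference would typically also report a chance-corrected statistic such as Cohen's kappa.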
A descriptive qualitative research design was used to determine whether participants ... simulation as a teaching method; a manikin offering effective learning; confidence ..... Tesch R. Qualitative Research: Analysis Types and Software Tools.
Powers, Christina M.; Grieger, Khara D.; Hendren, Christine Ogilvie; Meacham, Connie A.; Gurevich, Gerald; Lassiter, Meredith Gooding; Money, Eric S.; Lloyd, Jennifer M.; Beaulieu, Stephen M.
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb
Skoeld, T.; Feldman, Y.
The IAEA Department of Safeguards aims to provide credible assurances to the international community that States are fulfilling their safeguards obligations, namely that all nuclear material remains in peaceful use. In order to draw a soundly-based safeguards conclusion for a State that has a safeguards agreement in force with the IAEA, the Department establishes a knowledge base of the State's nuclear-related infrastructure and activities against which the State's declarations are evaluated for correctness and completeness. Open source information is one stream of data used in the evaluation of nuclear fuel cycle activities in the State. The Department is continuously working to ensure that it has access to the most up-to-date, accurate, relevant and credible open source information available, and has begun to examine the use of social media as a new source of information. The use of social networking sites has increased exponentially in the last decade. In fact, social media has emerged as the key vehicle for delivering and acquiring information in near real-time. Therefore, it has become necessary for the open source analyst to consider social media as an essential element in the broader concept of open source information. Characteristics such as 'immediacy', 'recency', and 'interactivity', which set social networks apart from traditional media, are also the attributes that make social media challenging to use as an efficient information-delivery platform and a credible source of information. New tools and technologies for social media analytics have begun to emerge to help systematically monitor and mine this large body of data. The paper will survey the social media landscape in an effort to identify platforms that could be of value for safeguards verification purposes. It will explore how a number of social networking sites, such as Twitter
Adeline Phaik Harn Chua; Kenneth R. Deans; Craig M. Parker
Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, and most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to...
Eloranta, E. W.; Spuler, S.; Hayman, M. M.
Many aspects of air quality research require information on the vertical distribution of pollution. Traditional measurements, obtained from surface-based samplers or passive satellite remote sensing, do not provide vertical profiles. Lidar can provide profiles of aerosol properties; however, traditional backscatter lidar suffers from uncertain calibrations and poorly constrained algorithms. These problems are avoided by High Spectral Resolution Lidar (HSRL), which provides absolutely calibrated vertical profiles of aerosol properties. The University of Wisconsin HSRL systems measure 532 nm wavelength aerosol backscatter cross-sections, extinction cross-sections, depolarization, and attenuated 1064 nm backscatter. These instruments are designed for long-term deployment at remote sites with minimal local support. Processed data are provided for public viewing and download in real time on our web site, http://hsrl.ssec.wisc.edu. Air pollution applications of HSRL data will be illustrated with examples acquired during air quality field programs including KORUS-AQ, DISCOVER-AQ, LAMOS, and FRAPPE. Observations include: (1) long-range transport of dust, air pollution, and smoke; (2) fumigation episodes in which elevated pollution is mixed down to the surface; (3) visibility restrictions by aerosols; and (4) diurnal variations in atmospheric optical depth. While HSRL is a powerful air quality research tool, its application in routine measurement networks is hindered by the high cost of current systems. Recent technical advances promise a next-generation HSRL using telecom components to greatly reduce system cost. This paper will present data generated by a prototype low-cost system constructed at NCAR. In addition to lower cost, operation at a non-visible near-infrared wavelength of 780 nm removes all FAA restrictions on the operation.
We introduce the notion of Electric Field Encephalography (EFEG), based on measuring electric fields of the brain, and demonstrate, using computer modeling, that given appropriate electric field sensors this technique may have significant advantages over the current EEG technique. Unlike EEG, EFEG can be used to measure brain activity in a contactless and reference-free manner at significant distances from the head surface. Principal component analysis using simulated cortical sources demonstrated that electric field sensors positioned 3 cm away from the scalp and characterized by the same signal-to-noise ratio as EEG sensors provided the same number of uncorrelated signals as scalp EEG. When positioned on the scalp, EFEG sensors provided 2-3 times more uncorrelated signals. This significant increase in the number of uncorrelated signals can be used for more accurate assessment of brain states in non-invasive brain-computer interfaces and neurofeedback applications. It may also lead to major improvements in source localization precision. Source localization simulations for the spherical and Boundary Element Method (BEM) head models demonstrated that localization errors are reduced two-fold when using electric fields instead of electric potentials. We have identified several techniques that could be adapted for the measurement of the electric field vector required for EFEG and anticipate that this study will stimulate new experimental approaches to utilize this new tool for functional brain research.
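One common way to operationalize the "number of uncorrelated signals" compared above is to count principal components whose variance exceeds the sensor noise floor. A minimal sketch of that count, under the assumption that it is taken from covariance eigenvalues (the paper's forward models and sensor simulations are not reproduced here):

```python
# Count principal components of multichannel data whose variance exceeds
# the sensor noise variance; this count is taken here as the number of
# uncorrelated signals. Illustrative only; the cited study's exact
# criterion is an assumption.

import numpy as np

def n_uncorrelated_signals(data, noise_variance):
    """data: (n_sensors, n_samples) array of sensor readings."""
    cov = np.cov(data)                    # sensor covariance matrix
    eigvals = np.linalg.eigvalsh(cov)     # PCA variances (ascending)
    return int(np.sum(eigvals > noise_variance))
```

For example, four noiseless sensors observing linear mixtures of two independent sources yield exactly two components above a tiny threshold.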
North, M. J. N.
Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition for marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital as well as individual learning parameters. The links represent transmission lines with individual capacities taken from a range of central distribution, outlying distribution and feeder line types. The application of SMART II to electric power systems studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
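The marginal-cost transaction dynamics described above can be illustrated with a much simpler merit-order market clearing, dispatching the cheapest generation first (an illustrative sketch only, not the SMART II algorithm, which is agent-based and also models transmission constraints, investment capital, and learning):

```python
# Merit-order clearing of a simplified spot market: generators are
# dispatched in order of marginal cost until demand is met, and the
# uniform clearing price is set by the marginal (last-dispatched) unit.

def clear_market(generators, demand):
    """generators: list of (marginal_cost, capacity) pairs.
    Returns (dispatch, price), where dispatch is a list of
    (marginal_cost, quantity) in dispatch order."""
    dispatch = []
    remaining = demand
    price = 0.0
    for cost, capacity in sorted(generators):
        if remaining <= 0:
            break
        q = min(capacity, remaining)
        dispatch.append((cost, q))
        remaining -= q
        price = cost  # uniform price set by the marginal unit
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return dispatch, price
```

Agent-based models like SMART II depart from this idealization precisely where it is least realistic: agents learn, hold capital, and face network limits rather than clearing against a single unconstrained stack.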
Vernardos, G.; Fluke, C. J.; Croton, D.; Bate, N. F.
As synoptic all-sky surveys begin to discover new multiply lensed quasars, the flow of data will enable statistical cosmological microlensing studies of sufficient size to constrain quasar accretion disk and supermassive black hole properties. In preparation for this new era, we are undertaking the GPU-Enabled, High Resolution cosmological MicroLensing parameter survey (GERLUMPH). We present here the GERLUMPH Data Release 1, which consists of 12,342 high resolution cosmological microlensing magnification maps and provides the first uniform coverage of the convergence, shear, and smooth matter fraction parameter space. We use these maps to perform a comprehensive numerical investigation of the mass-sheet degeneracy, finding excellent agreement with its predictions. We study the effect of smooth matter on microlensing induced magnification fluctuations. In particular, in the minima and saddle-point regions, fluctuations are enhanced only along the critical line, while in the maxima region they are always enhanced for high smooth matter fractions (≈0.9). We describe our approach to data management, including the use of an SQL database with a Web interface for data access and online analysis, obviating the need for individuals to download large volumes of data. In combination with existing observational databases and online applications, the GERLUMPH archive represents a fundamental component of a new microlensing eResearch cloud. Our maps and tools are publicly available at http://gerlumph.swin.edu.au/
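The mass-sheet degeneracy investigated above can be made concrete with the standard smooth-model magnification formula for a given convergence kappa and shear gamma: under the transformation kappa -> 1 - s(1 - kappa), gamma -> s*gamma, the magnification scales by 1/s^2, so magnification-based observables alone cannot fix the smooth mass sheet. A short sketch (the formula is standard; the GERLUMPH maps themselves add microlensing fluctuations on top of this smooth value):

```python
# Smooth-model lensing magnification and the mass-sheet transformation.
# magnification(kappa, gamma) = 1 / ((1 - kappa)^2 - gamma^2);
# applying mass_sheet_transform with factor s rescales it by 1/s^2.

def magnification(kappa, gamma):
    return 1.0 / ((1.0 - kappa) ** 2 - gamma ** 2)

def mass_sheet_transform(kappa, gamma, s):
    return 1.0 - s * (1.0 - kappa), s * gamma
```

For instance, transforming (kappa, gamma) = (0.4, 0.2) with s = 2 divides the magnification by exactly 4, which is the scaling the numerical maps are found to agree with.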
Vernardos, G.; Fluke, C. J.; Croton, D. [Centre for Astrophysics and Supercomputing, Swinburne University of Technology, P.O. Box 218, Hawthorn, Victoria, 3122 (Australia); Bate, N. F. [Sydney Institute for Astronomy, School of Physics, A28, University of Sydney, NSW, 2006 (Australia)
McMahan, Tracy A.; Shea, Charlotte A.; Finckenor, Miria; Ferguson, Dale
As NASA plans and implements the Vision for Space Exploration, managers, engineers, and scientists need lunar environment information that is readily available and easily accessed. For this effort, lunar environment data was compiled from a variety of missions from Apollo to more recent remote sensing missions, such as Clementine. This valuable information comes not only in the form of measurements and images but also from the observations of astronauts who have visited the Moon and people who have designed spacecraft for lunar missions. To provide a research tool that makes the voluminous lunar data more accessible, the Space Environments and Effects (SEE) Program, managed at NASA's Marshall Space Flight Center (MSFC) in Huntsville, AL, organized the data into a DVD knowledgebase: the Lunar e-Library. This searchable collection of 1100 electronic (.PDF) documents and abstracts makes it easy to find critical technical data and lessons learned from past lunar missions and exploration studies. The SEE Program began distributing the Lunar e-Library DVD in 2006. This paper describes the Lunar e-Library development process (including a description of the databases and resources used to acquire the documents) and the contents of the DVD product, demonstrates its usefulness with focused searches, and provides information on how to obtain this free resource.
King, Stephanie L
Over the years, playback experiments have helped further our understanding of the wonderful world of animal communication. They have provided fundamental insights into animal behaviour and the function of communicative signals in numerous taxa. As important as these experiments are, however, there is strong evidence to suggest that the information conveyed in a signal may only have value when presented interactively. By their very nature, signalling exchanges are interactive and therefore, an interactive playback design is a powerful tool for examining the function of such exchanges. While researchers working on frog and songbird vocal interactions have long championed interactive playback, it remains surprisingly underused across other taxa. The interactive playback approach is not limited to studies of acoustic signalling, but can be applied to other sensory modalities, including visual, chemical and electrical communication. Here, I discuss interactive playback as a potent yet underused technique in the field of animal behaviour. I present a concise review of studies that have used interactive playback thus far, describe how it can be applied, and discuss its limitations and challenges. My hope is that this review will result in more scientists applying this innovative technique to their own study subjects, as a means of furthering our understanding of the function of signalling interactions in animal communication systems. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Weber, Griffin M; Murphy, Shawn N; McMurry, Andrew J; Macfadden, Douglas; Nigrin, Daniel J; Churchill, Susanne; Kohane, Isaac S
The authors developed a prototype Shared Health Research Information Network (SHRINE) to identify the technical, regulatory, and political challenges of creating a federated query tool for clinical data repositories. Separate Institutional Review Boards (IRBs) at Harvard's three largest affiliated health centers approved use of their data, and the Harvard Medical School IRB approved building a Query Aggregator Interface that can simultaneously send queries to each hospital and display aggregate counts of the number of matching patients. Our experience creating three local repositories using the open source Informatics for Integrating Biology and the Bedside (i2b2) platform can be used as a road map for other institutions. The authors are actively working with the IRBs and regulatory groups to develop procedures that will ultimately allow investigators to obtain identified patient data and biomaterials through SHRINE. This will guide us in creating a future technical architecture that is scalable to a national level, compliant with ethical guidelines, and protective of the interests of the participating hospitals.
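The privacy model described above — each site answers a query with only an aggregate count of matching patients — can be sketched as follows. All names and data here are illustrative; the real SHRINE/i2b2 interfaces are federated web services, not local functions.

```python
# Minimal sketch of SHRINE's aggregate-count pattern: the aggregator
# broadcasts one query to each site and receives only matching-patient
# counts, never row-level data. Names are illustrative, not the actual
# i2b2/SHRINE API.
from typing import Callable, Dict, List

Site = Callable[[str], int]  # a "site" maps a query to a patient count

def make_site(records: List[dict]) -> Site:
    def count(query: str) -> int:
        # Toy predicate: count patients whose diagnosis matches the query.
        return sum(1 for r in records if r["diagnosis"] == query)
    return count

def aggregate_counts(sites: Dict[str, Site], query: str) -> Dict[str, int]:
    # The aggregator sees only per-site counts, mirroring the Query
    # Aggregator Interface's privacy model.
    return {name: site(query) for name, site in sites.items()}

sites = {
    "hospital_a": make_site([{"diagnosis": "T2DM"}, {"diagnosis": "HTN"}]),
    "hospital_b": make_site([{"diagnosis": "T2DM"}, {"diagnosis": "T2DM"}]),
}
per_site = aggregate_counts(sites, "T2DM")
print(per_site, "total:", sum(per_site.values()))
```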
Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu
The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain: viroinformatics. The availability of more than 100 Web servers and databases embracing all viruses or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever viruses [HFV], human papillomavirus [HPV], and West Nile virus) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available from any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Full Text Available This article describes issues associated with the use of content marketing strategy tools by scientific institutions, and shows the extent to which the tools of modern marketing are used by such institutions in Internet communication. Content marketing is currently accepted not only as a fashionable trend in modern marketing but, above all, is treated as an important tool for improving the Internet message so that it effectively engages users. An optimal selection and use of content marketing tools provides opportunities for enhancing the efficiency with which the generated message is received and accepted.
Scott-Phillips, Thomas C
Pragmatics has historically played a relatively peripheral role in language evolution research. This is a profound mistake. Here I describe how a pragmatic perspective can inform language evolution in the most fundamental way: by making clear what the natural objects of study are, and hence what the aims of the field should be.
Full Text Available Abstract Background: Personal digital assistants (PDAs) offer putative advantages over paper for collecting research data. However, there are no data prospectively comparing PDA and paper in the emergency department. The aim of this study was to prospectively compare the performance of PDA and paper enrollment instruments with respect to the time required and the errors generated. Methods: We randomized consecutive patients enrolled in an ongoing prospective study to having their data recorded either on a PDA or on a paper data collection instrument. For each method, we recorded the total time required for enrollment, including the time required for manual transcription (paper only) onto a computer database. We compared data error rates by examining missing data, nonsensical data, and errors made during the transcription of paper forms. Statistical comparisons were performed by Kruskal-Wallis and Poisson regression analyses for time and errors, respectively. Results: We enrolled 68 patients (37 PDA, 31 paper). Two of 31 paper forms were not available for analysis. Total data gathering times, inclusive of transcription, were significantly less for PDA (6:13 min per patient) than for paper (9:12 min per patient). Conclusion: Using a PDA-based data collection instrument for clinical research reduces the time required for data gathering and significantly improves data integrity.
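The Kruskal-Wallis comparison of enrollment times used in this study can be sketched as follows. The per-patient times below are synthetic values chosen only to mirror the reported group averages; the abstract does not include the raw data.

```python
# Illustrative re-analysis sketch of the study's statistical approach
# (hypothetical data; the real per-patient times are not in the abstract).
from scipy.stats import kruskal

# Enrollment times in minutes for the two arms (synthetic values chosen
# to mirror the reported averages: ~6.2 min for PDA, ~9.2 min for paper).
pda_times   = [5.0, 5.8, 6.2, 6.5, 7.1, 6.0, 6.4]
paper_times = [8.5, 9.0, 9.2, 9.8, 10.4, 9.5, 8.9]

# Kruskal-Wallis is the rank-based test the authors used for time.
stat, p_value = kruskal(pda_times, paper_times)
print(f"H = {stat:.2f}, p = {p_value:.4f}")
```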
Full Text Available This article disseminates the results of a programme of detailed archaeological survey and archive research on one of Europe's most important surviving late-medieval Guild Chapels — that of the Holy Cross Guild, Stratford-upon-Avon (Warwickshire). Today the building is part of Stratford-upon-Avon's tourist trail, located directly opposite William Shakespeare's home, 'New Place', and visited by thousands of tourists every year. However, its archaeological and historical significance has been overlooked owing to the extensive restoration of the building in the 19th and 20th centuries. This destroyed evidence for an internationally significant scheme of wall paintings within the Chapel, paid for by the London Mayor and Stratford-upon-Avon merchant, Hugh Clopton, an important member of the Holy Cross Guild and the original builder of 'New Place'. The paintings also have an important connection with Stratford-upon-Avon's most famous son, William Shakespeare, whose father may have been involved in their destruction and removal during the 16th century. Research by a team of historical archaeologists and digital heritage specialists at the Department of Archaeology, University of York, has revealed the significance of the Guild Chapel through the creation of a digital model and textual paradata, which form the focus of this article. The project is ground-breaking in that it moves beyond the traditional use of digital models as virtual reconstructions of past buildings to use the model itself as a research tool through which the user can explore and validate the evidence for the scheme directly. This is achieved through the creation of a palimpsest of antiquarian drawings of the paintings, made as they were revealed during restoration works in the 19th and 20th centuries, and set within their 3-dimensional architectural context. The model allows the user to compare and contrast differences in the recording methods, iconographies and interpretations of
Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M
Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this
Henry, Nancy L.
Technology and a variety of resources play an important role in students' educational lives. Vygotsky's (1987) theory of tool mediation suggests that cultural tools, such as computer software, influence individuals' thinking and action. However, it is not completely understood how technology and other resources influence student action. Middle…
Smith, Des H.V.; Moehrenschlager, Axel; Christensen, Nancy; Knapik, Dwight; Gibson, Keith; Converse, Sarah J.
Worldwide, approximately 168 bird species are captive-bred for reintroduction into the wild. Programs tend to be initiated for species with a high level of endangerment. Depressed hatching success can be a problem for such programs and has been linked to artificial incubation. The need for artificial incubation is driven by the practice of multiclutching to increase egg production or by uncertainty over the incubation abilities of captive birds. There has been little attempt to determine how artificial incubation differs from bird-contact incubation. We describe a novel archive (data-logger) egg and use it to compare temperature, humidity, and egg-turning in 5 whooping crane (Grus americana) nests, 4 sandhill crane (G. canadensis) nests, and 3 models of artificial incubator, each of which is used to incubate eggs in whooping crane captive-breeding programs. Mean incubation temperature was 31.7° C for whooping cranes and 32.83° C for sandhill cranes, well below that of the artificial incubators (which were set to a protocol value of 37.6° C). Humidity in crane nests varied considerably, but median humidity in all 3 artificial incubators was substantially different from that in the crane nests. Two artificial incubators failed to turn the eggs in a way that mimicked crane egg-turning. Archive eggs are an effective tool for guiding the management of avian conservation breeding programs, and can be custom-made for other species. They also have the potential to be applied to research on wild populations.
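The nest-versus-incubator comparison above can be sketched in a few lines. The logger samples below are synthetic values (not the study's data), chosen so the mean matches the reported 31.7° C whooping crane figure; the 37.6° C set point is the protocol value from the paper.

```python
# Toy summary of archive-egg logger readings (synthetic values, not the
# study's data): compare mean nest temperature against an incubator
# set point, as the paper does for crane nests vs. 37.6 C incubators.
import statistics

nest_temps_c = [31.2, 31.9, 31.5, 32.1, 31.8]   # hypothetical logger samples
incubator_setpoint_c = 37.6                      # protocol value from the paper

mean_nest = statistics.mean(nest_temps_c)
gap = incubator_setpoint_c - mean_nest
print(f"mean nest temp: {mean_nest:.2f} C; incubator is {gap:.2f} C warmer")
```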
Benis, Arriel; Hoshen, Moshe
Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data in support of Public Health Knowledge Discovery. It provides domain experts with a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing socio-demographic and ADHD filled-prescription data between 2006 and 2016 for 1,605,800 children aged 6 to 17 years. DisEpi aims to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes with similar trends. DisEpi combines hierarchical clustering graphics with a heatmap in which color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that (1) relatively younger children, and particularly the youngest children in a class, are treated more often, (2) medication incidence increased between 2006 and 2011 but then stabilized, and (3) progression rates of medication incidence differ for each of the 3 main discovered clusters (i.e., profiles) of treated children. DisEpi delivered results similar to those previously published using classical statistical approaches, while requiring minimal preparation and fewer iterations and generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.
Vitova, T.; Brendebach, B.; Dardenne, K.; Denecke, M. A.; Lebid, A.; Löble, M.; Rothe, J.; Batuk, O. N.; Hormes, J.; Liu, D.; Breher, F.; Geckeis, H.
High resolution X-ray emission spectroscopy (HRXES) is becoming increasingly important for our understanding of electronic and coordination structures. The combination of such information with the development of quantum theoretical tools will advance our capability for predicting reactivity and physical behavior, especially of 5f elements. HRXES can be used to remove lifetime broadening by registering the partial fluorescence yield emitted by the sample (i.e., recording a windowed signal from the energy-dispersed fluorescence emission while varying the incident photon energy), thereby yielding highly resolved X-ray absorption fine structure (XAFS) spectra. Such spectra often display resonant features not observed in conventional XAFS. The spectrometer set-up can also be used for a wide range of other experiments, for example, resonant inelastic X-ray scattering (RIXS), where bulk electron configuration information in solids, liquids and gases is obtained. Also possible are valence-selective XAFS studies, in which the local structure of a selected valence state of an element present in a mixture of valence states can be obtained, as well as site-selective XAFS studies, in which the coordination structure of a metal bound to selected elements can be differentiated from that of all the other ligating atoms. An HRXES spectrometer has been constructed and is presently being commissioned for use at the INE-Beamline for actinide research at the synchrotron source ANKA at FZK. We present the spectrometer's compact, modular design, optimized for attaining a wide range of energies, and first test measurement results. Examples from HRXES studies of lanthanides, the counterparts of the actinides, are also shown.
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based point-of-access tools for the scientific community, ranging from the Runs-On-Request System, which provides unprecedented global access to the largest collection of state-of-the-art solar and space physics models; the Integrated Space Weather Analysis (iSWA) system, a powerful dissemination system for space weather information; and Advanced Online Visualization and Analysis tools for more accurate interpretation of model results; to standard data formats for simulation data downloads and mobile apps for viewing space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including the tools that have revolutionized the way we do research and improve our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
Duffy, Christopher; Leonard, Lorne; Shi, Yuning; Bhatt, Gopal; Hanson, Paul; Gil, Yolanda; Yu, Xuan
Using a series of recent examples and papers, we explore some progress and potential for virtual (cyber-) collaboration inspired by access to high-resolution, harmonized public-sector data at continental scales. The first example describes 7 meso-scale catchments in Pennsylvania, USA, where the watershed is forced by climate reanalysis and IPCC (Intergovernmental Panel on Climate Change) future climate scenarios. We show how existing public-sector data and community models are currently able to resolve fine-scale eco-hydrologic processes regarding wetland response to climate change. The results reveal that regional climate change is only part of the story, with large variations in flood and drought response associated with differences in terrain, physiography, landuse and/or hydrogeology. The importance of community-driven virtual testbeds is demonstrated in the context of Critical Zone Observatories, where earth scientists from around the world are organizing hydro-geophysical data and model results to explore new processes that couple hydrologic models with land-atmosphere interaction, biogeochemical weathering, the carbon-nitrogen cycle, landscape evolution and ecosystem services. Critical Zone cyber-research demonstrates how data-driven model development requires a flexible computational structure where process modules are relatively easy to incorporate and where new data structures can be implemented. From the perspective of "Big Data", the paper points out that extrapolating results from virtual observatories to catchments at continental scales will require centralized or cloud-based cyberinfrastructure as a necessary condition for effectively sharing petabytes of data and model results. Finally, we outline how innovative cyber-science is supporting earth-science learning, sharing and exploration through the use of on-line tools where hydrologists and limnologists are sharing data and models for simulating the coupled impacts of catchment
Yang, Yingzhen; Costa, Alex; Leonhardt, Nathalie; Siegel, Robert S; Schroeder, Julian I
Background A common limitation in guard cell signaling research is that it is difficult to obtain consistent high expression of transgenes of interest in Arabidopsis guard cells using known guard cell promoters or the constitutive 35S cauliflower mosaic virus promoter. An additional drawback of the 35S promoter is that ectopically expressing a gene throughout the organism could cause pleiotropic effects. To improve available methods for targeted gene expression in guard cells, we isolated strong guard cell promoter candidates based on new guard cell-specific microarray analyses of 23,000 genes that are made available together with this report. Results A promoter, pGC1(At1g22690), drove strong and relatively specific reporter gene expression in guard cells including GUS (beta-glucuronidase) and yellow cameleon YC3.60 (GFP-based calcium FRET reporter). Reporter gene expression was weaker in immature guard cells. The expression of YC3.60 was sufficiently strong to image intracellular Ca2+ dynamics in guard cells of intact plants and resolved spontaneous calcium transients in guard cells. The GC1 promoter also mediated strong reporter expression in clustered stomata in the stomatal development mutant too-many-mouths (tmm). Furthermore, the same promoter::reporter constructs also drove guard cell specific reporter expression in tobacco, illustrating the potential of this promoter as a method for high level expression in guard cells. A serial deletion of the promoter defined a guard cell expression promoter region. In addition, anti-sense repression using pGC1 was powerful for reducing specific GFP gene expression in guard cells while expression in leaf epidermal cells was not repressed, demonstrating strong cell-type preferential gene repression. Conclusion The pGC1 promoter described here drives strong reporter expression in guard cells of Arabidopsis and tobacco plants. It provides a potent research tool for targeted guard cell expression or gene silencing. It is also
Altman, Eric I; Baykara, Mehmet Z; Schwarz, Udo D
Although atomic force microscopy (AFM) was rapidly adopted as a routine surface imaging apparatus after its introduction in 1986, it has not been widely used in catalysis research. The reason is that common AFM operating modes do not provide the atomic resolution required to follow catalytic processes; rather the more complex noncontact (NC) mode is needed. Thus, scanning tunneling microscopy has been the principal tool for atomic scale catalysis research. In this Account, recent developments in NC-AFM will be presented that offer significant advantages for gaining a complete atomic level view of catalysis. The main advantage of NC-AFM is that the image contrast is due to the very short-range chemical forces that are of interest in catalysis. This motivated our development of 3D-AFM, a method that yields quantitative atomic resolution images of the potential energy surfaces that govern how molecules approach, stick, diffuse, and rebound from surfaces. A variation of 3D-AFM allows the determination of forces required to push atoms and molecules on surfaces, from which diffusion barriers and variations in adsorption strength may be obtained. Pushing molecules towards each other provides access to intermolecular interaction between reaction partners. Following reaction, NC-AFM with CO-terminated tips yields textbook images of intramolecular structure that can be used to identify reaction intermediates and products. Because NC-AFM and STM contrast mechanisms are distinct, combining the two methods can produce unique insight. It is demonstrated for surface-oxidized Cu(100) that simultaneous 3D-AFM/STM yields resolution of both the Cu and O atoms. Moreover, atomic defects in the Cu sublattice lead to variations in the reactivity of the neighboring O atoms. It is shown that NC-AFM also allows a straightforward imaging of work function variations which has been used to identify defect charge states on catalytic surfaces and to map charge transfer within an individual
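The abstract notes that diffusion barriers and forces can be extracted from the measured potential energy surfaces. A minimal one-dimensional sketch of that extraction is below; the sinusoidal corrugation and the 0.25 eV barrier are synthetic illustrative values, not measured 3D-AFM data.

```python
# Minimal 1-D illustration of extracting a diffusion barrier and the
# lateral force from a potential energy profile, as 3D-AFM data permit.
# The sinusoidal corrugation below is synthetic, not measured data.
import numpy as np

x = np.linspace(0.0, 0.4e-9, 400)                           # tip position (m)
U = 0.125 * 1.602e-19 * (1 - np.cos(2*np.pi*x/0.4e-9))      # potential (J), 0.25 eV corrugation

barrier_eV = (U.max() - U.min()) / 1.602e-19   # diffusion barrier in eV
F = -np.gradient(U, x)                         # lateral force F = -dU/dx (N)

print(f"barrier ~ {barrier_eV:.2f} eV, max lateral force ~ {np.abs(F).max()*1e9:.2f} nN")
```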
Zhao, Y.; Zhao, Y. L.; Shao, YW; Hu, T. J.; Zhang, Q.; Ge, X. H.
Cutting force is an important factor that affects machining accuracy, cutting vibration and tool wear. Machining condition monitoring by cutting force measurement is a key technology for intelligent manufacturing. Current cutting force sensors suffer from large volume, complex structure and poor compatibility in practical applications; to address these problems, a smart cutting tool for cutting force measurement is proposed in this paper. Commercial MEMS (Micro-Electro-Mechanical System) strain gauges with high sensitivity and small size are adopted as the transducing elements of the smart tool, and a structurally optimized cutting tool is fabricated for MEMS strain gauge bonding. Static calibration results show that the developed smart cutting tool is able to measure cutting forces in both the X and Y directions, with a cross-interference error within 3%. Its overall accuracy is 3.35% and 3.27% in the X and Y directions, respectively, and its sensitivity is 0.1 mV/N, which is very suitable for measuring the small cutting forces encountered in high-speed precision machining. The smart cutting tool is portable and reliable for practical application in CNC machine tools.
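Two-axis force recovery with a small cross-interference term is commonly handled by inverting a calibration matrix; a sketch of that idea is below. The diagonal uses the 0.1 mV/N sensitivity reported in the abstract, but the off-diagonal cross-talk value and the whole calibration matrix are assumptions for illustration, not the paper's calibration data.

```python
# Sketch of two-axis force recovery from strain-gauge voltages using a
# calibration matrix, a usual way cross-interference (<3% here) is
# handled. The matrix values are hypothetical, not the paper's.
import numpy as np

# Volts per newton: diagonal = 0.1 mV/N sensitivity from the paper;
# off-diagonal = assumed ~2% cross-talk between X and Y channels.
C = np.array([[1.0e-4, 2.0e-6],
              [2.0e-6, 1.0e-4]])   # V/N

def forces_from_voltages(v_xy):
    # Invert the calibration: solve C @ F = V for the force vector F.
    return np.linalg.solve(C, v_xy)

v = C @ np.array([50.0, 20.0])     # voltages produced by Fx=50 N, Fy=20 N
print(forces_from_voltages(v))     # recovers ~[50, 20] N
```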
Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy
Objective: To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: A two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by inter-tester reliability testing. Participants: Allied health clinicians. Methods: A modified Delphi process was used to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve threshold. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across the 10 clinical records. Results: Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation of all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Conclusions: Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency, supporting its future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps and inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.
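The inter-tester reliability statistic used above can be illustrated with a two-way ICC. The abstract does not report raw ratings or the exact ICC form, so the sketch below assumes the Shrout & Fleiss ICC(2,1) formulation and uses their classic published example dataset (6 targets, 4 raters) rather than audit data.

```python
# Illustrative computation of a two-way ICC (Shrout & Fleiss ICC(2,1))
# of the kind used for the audit tool's inter-tester reliability.
# The abstract gives no raw data, so the classic Shrout & Fleiss (1979)
# example (6 targets rated by 4 raters) is used instead.
import numpy as np

def icc_2_1(x):
    # x: n targets (rows) rated by k raters (columns)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_err = ((x - x.mean(axis=1, keepdims=True)
                 - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

ratings = np.array([[9, 2, 5, 8],
                    [6, 1, 3, 2],
                    [8, 4, 6, 8],
                    [7, 1, 2, 6],
                    [10, 5, 6, 9],
                    [6, 2, 4, 7]], dtype=float)
print(round(icc_2_1(ratings), 2))
```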
Wiche, Oliver; Székely, Balazs; Moschner, Christin; Heilmeier, Hermann
plots was randomized and every treatment was replicated fivefold. Soil solution was collected weekly with plastic suction cups. Concentrations of trace metals in shoots of oat and in soil solution were measured with ICP-MS. As a result, we found that both the concentrations of trace elements in oat plants and the mobility of P and trace metals in soil solution were increased by intercropping with white lupin. Mixed culture of oat with 11% white lupin significantly increased the concentrations of the trace nutrients Fe, Mn and Zn, as well as the concentrations of the trace metals Pb, La, Nd, Sc, Th and U, in tissues of oat. Surprisingly, mixed cultures with 33% white lupin did not significantly affect trace metal concentrations in oat, which might be a consequence of increasing competition between the roots of white lupin and oat for nutrients and trace metals. In conclusion, we found that mixed cultures of white lupin with cereals might be a powerful tool for enhanced phytoremediation and phytomining. However, the processes involved in the physicochemical mechanism of element uptake as affected by oat/white lupin co-cultivation remain unknown, and further studies on this topic are planned. These studies have been carried out in the framework of the PhytoGerm project, financed by the Federal Ministry of Education and Research, Germany. The authors are grateful to the students and laboratory assistants who contributed to the field work and sample preparation.
JMBE Production Editor
Full Text Available Correction for Sarah E. Council and Julie E. Horvath, “Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom,” which appeared in the Journal of Microbiology & Biology Education, volume 17, number 1, March 2016, pages 38–40.
Ng, Wan; Gunstone, Richard
Investigates the use of the World Wide Web (WWW) as a research and teaching tool in promoting self-directed learning groups of 15-year-old students. Discusses the perceptions of students of the effectiveness of the WWW in assisting them with the construction of knowledge on photosynthesis and respiration. (Contains 33 references.) (Author/YDS)
Federated searching was once touted as the library world's answer to Google, but ten years since federated searching technology's inception, how does it actually compare? This study focuses on undergraduate student preferences and perceptions when doing research using both Google and a federated search tool. Students were asked about their…
Zhang, Hui; Jarjour, Andrew A.; Boyd, Amanda; Williams, Anna
Multiple sclerosis is a demyelinating disease of the central nervous system which only affects humans. This makes it difficult to study at a molecular level, and to develop and test potential therapies that may change the course of the disease. The development of therapies to promote remyelination in multiple sclerosis is a key research aim, to both aid restoration of electrical impulse conduction in nerves and provide neuroprotection, reducing disability in patients. Testing a remyelination therapy in the many and various in vivo models of multiple sclerosis is expensive in terms of time, animals and money. We report the development and characterisation of an ex vivo slice culture system using mouse brain and spinal cord, allowing investigation of myelination, demyelination and remyelination, which can be used as an initial reliable screen to select the most promising remyelination strategies. We have automated the quantification of myelin to provide a high content and moderately-high-throughput screen for testing therapies for remyelination both by endogenous and exogenous means and as an invaluable way of studying the biology of remyelination. PMID:21515259
The developments in the culture of democracy have led to the shaping of activities in politics and political parties from a marketing point of view. The concept of political marketing has therefore become a new field of study for both academics and professionals. The aim of this study is to investigate whether voters' perceptions of political trust differ according to their demographic and socio-cultural characteristics. The data, collected through face-to-face surveys of 574 participants, were analyzed statistically using the SPSS package. According to the results, the hypotheses that the confidence variable, one of the three factors constituting voters' trust perceptions, differs according to age group, education level, professional group and political view were accepted. The hypotheses that the 'doing non-political work' variable, another factor, differs according to education level, income level and occupational group were also accepted, while the remaining hypotheses were rejected. The communication variable did not differ by any characteristic of the participants. Given the limited work on this topic, the findings, which are conceptually in accordance with previous results, show that this study makes a considerable contribution to the literature.
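The group-difference tests reported above can be illustrated with a minimal one-way ANOVA: do mean "confidence" scores differ across age groups? This is a hedged sketch with toy scores, not the study's 574-respondent dataset or its SPSS workflow.

```python
from scipy.stats import f_oneway

# Toy confidence scores for three age groups (illustrative values only).
young  = [3.1, 2.9, 3.4, 3.0, 2.8]
middle = [3.6, 3.8, 3.5, 3.9, 3.7]
older  = [4.2, 4.0, 4.4, 4.1, 4.3]

# One-way ANOVA: H0 = all group means are equal.
stat, p = f_oneway(young, middle, older)
if p < 0.05:
    print(f"confidence differs across age groups (F={stat:.1f}, p={p:.4g})")
```

With clearly separated toy groups the null is rejected; with real survey data one would also check ANOVA assumptions or fall back to a Kruskal-Wallis test.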
The paper deals with one segment of a broader study of universality and systematicness in the application of the seven basic quality tools (7QC tools), which can be used in different areas: power plants, the process industry, government, and health and tourism services. The aim of the paper is to show through practical examples that application of the 7QC tools is a real possibility. Furthermore, the research shows to what extent the selected tools are in use and the reasons why their broader application is avoided. A simple example of successful application of the quality tools is shown for a selected company in the process industry.
Powers, Christina M. [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Grieger, Khara D. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Hendren, Christine Ogilvie [Center for the Environmental Implications of NanoTechnology, Duke University, Durham, NC 27708 (United States)]; Meacham, Connie A. [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Gurevich, Gerald [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Lassiter, Meredith Gooding [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Money, Eric S. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Lloyd, Jennifer M. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Beaulieu, Stephen M. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb.
Good, Marjorie J; Hurley, Patricia; Woo, Kaitlin M; Szczepanek, Connie; Stewart, Teresa; Robert, Nicholas; Lyss, Alan; Gönen, Mithat; Lilenbaum, Rogerio
Clinical research program managers are regularly faced with the quandary of determining how much of a workload research staff members can manage while they balance clinical practice and still achieve clinical trial accrual goals, maintain data quality and protocol compliance, and stay within budget. A tool was developed to measure clinical trial-associated workload, to apply objective metrics toward documentation of work, and to provide clearer insight to better meet clinical research program challenges and aid in balancing staff workloads. A project was conducted to assess the feasibility and utility of using this tool in diverse research settings. Community-based research programs were recruited to collect and enter clinical trial-associated monthly workload data into a web-based tool for 6 consecutive months. Descriptive statistics were computed for self-reported program characteristics and workload data, including staff acuity scores and number of patient encounters. Fifty-one research programs representing 30 states participated. Median staff acuity scores were highest for staff with patients enrolled in studies and receiving treatment, relative to staff with patients in follow-up status. Treatment trials typically resulted in higher median staff acuity than cancer control, observational/registry, and prevention trials. Industry trials exhibited higher median staff acuity scores than trials sponsored by the National Institutes of Health/National Cancer Institute, academic institutions, or others. The results from this project demonstrate that trial-specific acuity measurement is a better measure of workload than simply counting the number of patients. The tool was shown to be feasible and usable in diverse community-based research settings. Copyright © 2016 by American Society of Clinical Oncology.
Background: To the best of our knowledge, a strategic approach to defining the contents of structured clinical documentation tools for both routine clinical patient care and research purposes has not been reported so far, although electronic health records will become more and more structured and detailed in the future. Objective: To achieve an interdisciplinary consensus on a checklist to be considered in the preparation of disease- and situation-specific clinical documentation tools. Methods: A 2-round Delphi consensus-based process was conducted with 19 physicians of different disciplines and 14 students from Austria, Switzerland, and Germany. Agreement was defined as 80% or more positive votes of the participants. Results: The participants agreed that a working group should be set up for the development of structured disease- or situation-specific documentation tools (97% agreement). The final checklist included 4 recommendations concerning the setup of the working group, 12 content-related recommendations, and 3 general and technical recommendations (mean agreement [standard deviation] = 97.4% [4.0%], ranging from 84.2% to 100.0%). Discussion and Conclusion: In the future, disease- and situation-specific structured documentation tools will provide an important bridge between registries and electronic health records. Clinical documentation tools defined according to this Delphi consensus-based checklist will provide data for registries while serving as high-quality data acquisition tools in routine clinical care.
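The Delphi consensus rule described above (an item is accepted when at least 80% of participants vote for it) reduces to a one-line check. The vote counts below are illustrative, not the study's data.

```python
def agreement(positive_votes, total_votes):
    """Share of participants voting in favour of an item."""
    return positive_votes / total_votes

def reaches_consensus(positive_votes, total_votes, threshold=0.80):
    """Delphi acceptance rule: agreement at or above the threshold."""
    return agreement(positive_votes, total_votes) >= threshold

# Example: 32 of 33 participants approve an item.
print(reaches_consensus(32, 33))  # 0.97 >= 0.80 -> True
print(reaches_consensus(26, 33))  # 0.79 <  0.80 -> False
```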
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-qu...
translate MI into routine clinical practice. MI assays offer four significant advantages over conventional techniques used in in vitro and cell culture biologic research: assessment of whole-animal phenomena, repeatability, functionality and quantification. Radiopharmacy is a well-established discipline that supports clinical Nuclear Medicine imaging. It grew on the basis of Radiopharmacology and other disciplines that helped it develop and convert labeled compounds into radiopharmaceuticals that are routinely used in clinical Nuclear Medicine. The aim of this field is to study the functionality of tissues and organs in a living organism. In recent years this discipline was the first area to grow toward MI, owing to advances in labeled probes and equipment and, principally, the involvement of professionals from different areas, moving again toward basic Radiopharmacology. This conference will present some basics of the molecular sciences, with emphasis on Radiopharmacology, and the fundamentals of molecular imaging in clinical and experimental pharmacology, including how imaging can be used to assess specific molecular targets, with the belief that in the near future specific imaging of such targets will allow earlier detection and characterization of disease, earlier and direct molecular assessment of treatment effects, and a more fundamental understanding of disease processes. (authors)
Simmons, Aaron B; Bloomsburg, Samuel J; Billingslea, Samuel A; Merrill, Morgan M; Li, Shuai; Thomas, Marshall W; Fuerst, Peter G
superior colliculus. Pou4f2(Cre) provides multiple uses for the vision researcher's genetic toolkit. First, Pou4f2(Cre) is a knock-in allele that can be used to eliminate Pou4f2, resulting in depletion of RGCs. Second, expression of Cre in male germ cells makes this strain an efficient germline activator of recombination, for example, to target LoxP-flanked sequences in the whole mouse. Third, Pou4f2(Cre) efficiently targets RGCs, amacrine cells, bipolar cells, horizontal cells, and a small number of photoreceptors within the retina, as well as the visual centers in the brain. Unlike other Cre recombinase lines that target retinal neurons, no recombination was observed in Müller or other retinal glia. These properties make this Cre recombinase line a useful tool for vision researchers.
Although the use of the internet is expanding rapidly on college campuses, little is known about student internet use, how students perceive the reliability of internet information, and how successful they are in searching the internet. The aim of this project is to analyze the biochemistry content available on web pages, evaluating its quality, trustworthiness and effectiveness. Fourteen sites were analyzed with regard to content, presence of bibliographic references, authorship, responsibility for the content, and adequacy for the target audience. The great majority did not mention bibliographic references or a target audience. Fewer than half of the researched sites disclosed the names and/or qualifications of the information providers. Some sites contained critical conceptual errors, such as: participation of H2O in the dark phase of photosynthesis, carnivorous animals feeding only on herbivores, errors in the overall equation of photosynthesis, NADH2 instead of NAD+, etc. Half of them presented identical texts and figures. None of the analyzed sites was thus considered excellent. Our data strengthen the need for rigorous evaluation of educational resources on biochemical themes on the web.
Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu
Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. Numerous bioinformatics tools developed for miRNA analysis have utility for detection, expression profiling, functional analysis, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing language. © The Author 2017. Published by Oxford University Press. All rights reserved.
Campbell, A. Malcolm; Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason
The "Vision and Change" report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area…
Maor, Dorit; Ensor, Jason D.; Fraser, Barry J.
Supervision of doctoral students needs to be improved to increase completion rates, reduce attrition rates (estimated to be at 25% or more) and improve quality of research. The current literature review aimed to explore the contribution that technology can make to higher degree research supervision. The articles selected included empirical studies…
Charged particle activation analysis, based on bombardment with 15 MeV protons from a cyclotron, was used to study frictional wear at the contact zones of cutting tools, roller bearings and gear teeth. The radioactivity of the resulting isotopes, such as Co-56, Co-58 and Re-183, serves as a measure of the mass changes on the tool surfaces. The method is suitable for studying the parameters affecting wear processes and the role of cutting fluid, and also for assessing the economic factors in production planning.
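The measurement principle above can be sketched numerically: the activated surface layer's activity drops both by radioactive decay and by material loss, so the measured activity is first decay-corrected and the remaining deficit attributed to wear. This is a hedged, simplified model (uniform activation assumed); the activities and times below are illustrative, with the Co-56 half-life taken as roughly 77 days.

```python
import math

def decay_corrected(activity, elapsed_days, half_life_days):
    """Back-correct a measured activity to the start of the wear test."""
    lam = math.log(2) / half_life_days          # decay constant
    return activity * math.exp(lam * elapsed_days)

def worn_fraction(a_initial, a_measured, elapsed_days, half_life_days):
    """Fraction of the activated surface layer removed by wear."""
    return 1.0 - decay_corrected(a_measured, elapsed_days, half_life_days) / a_initial

# Illustrative Co-56 example: initial 1000 kBq, 700 kBq read after 30 days.
print(f"{worn_fraction(1000.0, 700.0, 30.0, 77.0):.3f}")  # ~0.083
```

Here only about 8% of the activity loss is attributable to wear; the rest is ordinary decay, which is why the decay correction step matters.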
Gualda, G. A.; Ghiorso, M. S.
The thermodynamic modeling software MELTS (and its derivatives) is a powerful and much-utilized tool for investigating crystallization and melting in natural magmatic systems. Rhyolite-MELTS (Gualda et al. 2012, J. Petrol. 53:875-890) is a recent recalibration of MELTS aimed at better capturing the evolution of magmas present in the upper crust (up to ~400 MPa pressure). Currently, most users of rhyolite-MELTS rely on a graphical user interface (GUI), which can be run on UNIX/LINUX and Mac OS X computers. While the interface is powerful and flexible, it can be somewhat cumbersome for the novice, and the output is in the form of text files that need to be processed offline. This situation is probably the main reason why MELTS, despite its great potential, has not been used more frequently for teaching purposes. We are currently developing an alternative GUI for rhyolite-MELTS using web services consumed by a VBA backend in Microsoft Excel©. The goal is to create a much more interactive tool that is easy to use, that can be made available to a widespread audience, and that will be useful for both research and teaching. The interface is contained within a macro-enabled workbook, which includes editable cells where the user can insert the model input information. Interface buttons initiate computations that are executed on a central server at OFM Research in Seattle (WA). Results of simple calculations are shown immediately within the interface itself. For instance, a user can very rapidly determine the temperature at which a magma of a given composition is completely molten (i.e. find the liquidus), or determine which phases are present, in what abundances, their compositions, and their physical properties (e.g. density, viscosity) at any given combination of temperature, pressure and oxygen fugacity. We expect that using the interface in this mode will greatly facilitate building intuition about magmas and their properties. It is also possible to combine a sequence of
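The "find the liquidus" use case described above amounts to a root search over temperature. The sketch below bisects until the melt fraction reaches 1; `melt_fraction` is a hypothetical stand-in for a rhyolite-MELTS computation (here a toy linear ramp), not the real web-service API.

```python
def melt_fraction(temp_c):
    """Toy stand-in for a rhyolite-MELTS call: linear ramp from a
    solidus at 700 C to a liquidus at 950 C."""
    return min(1.0, max(0.0, (temp_c - 700.0) / 250.0))

def find_liquidus(lo=600.0, hi=1400.0, tol=0.01):
    """Bisect for the lowest temperature at which the system is fully molten."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if melt_fraction(mid) >= 1.0:
            hi = mid     # fully molten: liquidus is at or below mid
        else:
            lo = mid     # partially molten: liquidus is above mid
    return hi

print(round(find_liquidus(), 1))  # -> 950.0 for the toy ramp
```

In a real workbook the same loop would issue one server request per bisection step, which is why a coarse tolerance keeps the interaction responsive.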
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Gaarder, Marie
Evidence-gap maps present a new addition to the tools available to support evidence-informed policy making. Evidence-gap maps are thematic evidence collections covering a range of issues such as maternal health, HIV/AIDS, and agriculture. They present a visual overview of existing systematic reviews or impact evaluations in a sector or subsector, schematically representing the types of int...
Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.
Recent years have seen increasing attention to social aspects of software engineering, including studies of the emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these
Many of us nowadays invest significant amounts of time in sharing our activities and opinions with friends and family via social networking tools such as Facebook, Twitter or other related websites. However, despite the availability of many platforms for scientists to connect and...
Kuru Cetin, Saadet
In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools using alternative observation tools regarding the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years and the class observations…
Podhora, A.; Helming, K.; Adenauer, L.; Heckelei, T.; Kautto, P.; Reidsma, P.; Rennings, K.; Turnpenny, J.; Jansen, J.M.L.
Since 2002, the European Commission has employed the instrument of ex-ante impact assessments (IA) to help focus its policy-making process on implementing sustainable development. Scientific tools should play an essential role of providing the evidence base to assess the impacts of alternative
Verhagen, Evert; Voogt, Nelly; Bruinsma, Anja; Finch, Caroline F
Evidence of effectiveness does not equal successful implementation. To progress the field, practical tools are needed to bridge the gap between research and practice and to truly unite effectiveness and implementation evidence. This paper describes the Knowledge Transfer Scheme integrating existing implementation research frameworks into a tool which has been developed specifically to bridge the gap between knowledge derived from research on the one side and evidence-based usable information and tools for practice on the other.
Wallis, Selina; Cole, Donald C; Gaye, Oumar; Mmbaga, Blandina T; Mwapasa, Victor; Tagbor, Harry; Bates, Imelda
Research is key to achieving global development goals. Our objectives were to develop and test an evidence-informed process for assessing health research management and support systems (RMSS) in four African universities and for tracking interventions to address capacity gaps. Four African universities. 83 university staff and students from 11 cadres. A literature-informed 'benchmark' was developed and used to itemise all components of a university's health RMSS. Data on all components were collected during site visits to four African universities using interview guides, document reviews and facilities observation guides. Gaps in RMSS capacity were identified against the benchmark and institutional action plans developed to remedy gaps. Progress against indicators was tracked over 15 months and common challenges and successes identified. Common gaps in operational health research capacity included no accessible research strategy, a lack of research e-tracking capability and inadequate quality checks for proposal submissions and contracts. Feedback indicated that the capacity assessment was comprehensive and generated practical actions, several of which were no-cost. Regular follow-up helped to maintain focus on activities to strengthen health research capacity in the face of challenges. Identification of each institution's strengths and weaknesses against an evidence-informed benchmark enabled them to identify gaps in their operational health research systems, to develop prioritised action plans, to justify resource requests to fulfil the plans and to track progress in strengthening RMSS. Use of a standard benchmark, approach and tools enabled comparisons across institutions, which has accelerated production of evidence about the science of research capacity strengthening. The tools could be used by institutions seeking to understand their strengths and to address gaps in research capacity. Research capacity gaps that were common to several institutions could be
Poster presented at the Research Bazaar 2015 at Melbourne University, Australia. Conference attendees were asked to share an overview of their project and the digital platforms they used in their research.
Dahl, Jan Erik
In the studied master's course, students participated both as research objects in a digital annotation experiment and as critical investigators of this technology in their semester projects. The students' role paralleled the researcher's role, opening an opportunity for researcher-student co-learning within what is often referred to as…
Sturzenegger, Susi; Johnsson, Kai; Riezman, Howard
Funded by the Swiss National Science Foundation to promote cutting edge research as well as the advancement of young researchers and women, technology transfer, outreach and education, the NCCR (Swiss National Centre of Competence in Research) Chemical Biology is co-led by Howard Riezman, University of Geneva and Kai Johnsson, École Polytechnique Fédérale de Lausanne (EPFL).
Highlights: • A GUI-based intuitive tool for data format analysis is presented. • Data can be viewed in any data types specified by the user in real time. • Analyzed formats are saved and reused as templates for other data of the same forms. • Users can easily extract contents in any forms by writing a simple script file. • The tool would be useful for exchanging data in collaborative fusion research. - Abstract: An intuitive tool with a graphical user interface (GUI) for analyzing the formats and extracting the contents of binary data in fusion research is presented. Users can examine the structure of binary data at arbitrary addresses by selecting a type from a list of radio buttons in the data inspection window and checking its representation instantly on the computer screen. The result of the analysis is saved in a file which contains information such as the name, data type, start address, and array size of the data. If the array size of some data depends on values that appear earlier in the file, and the user specifies this relation in the inspection window, the resulting file can also be used as a format template for the same series of data. By writing a simple script, users can extract the contents of the data to either a text or a binary file in the format of their preference. As a real-life example, the tool is applied to the MHD equilibrium data at JT-60U, where poloidal flux data are extracted and converted to a format suitable for contour plotting in another data visualization program. The tool would be useful in collaborative fusion research for exchanging relatively small-size data, which don't fit in well with the standard routine processes
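The core inspection idea above (view the bytes at a user-chosen offset as a user-chosen type, and save the choices as a reusable format template) can be sketched with Python's `struct` module. The field names and type list are illustrative assumptions, not the tool's actual format.

```python
import struct

# User-selectable types, as in the radio-button list (little-endian codes).
TYPE_CODES = {"int32": "<i", "float64": "<d", "uint16": "<H"}

def inspect(data, offset, type_name):
    """Interpret the bytes at `offset` as the requested type."""
    return struct.unpack_from(TYPE_CODES[type_name], data, offset)[0]

# A toy "binary file": an int32 followed by a float64.
blob = struct.pack("<id", 42, 3.5)

# A saved format template: (name, start address, type).
template = [("shot_number", 0, "int32"), ("flux_value", 4, "float64")]
record = {name: inspect(blob, off, t) for name, off, t in template}
print(record)  # {'shot_number': 42, 'flux_value': 3.5}
```

A size-dependency (one field giving a later array's length) would just make the template a small loop that reads the count field first, then unpacks that many elements.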
Jessani, Nasreen; Lewy, Daniela; Ekirapa-Kiracho, Elizabeth; Bennett, Sara
Despite significant investments in health systems research (HSR) capacity development, there is a dearth of information regarding how to assess HSR capacity. An alliance of schools of public health (SPHs) in East and Central Africa developed a tool for the self-assessment of HSR capacity with the aim of producing institutional capacity development plans. Between June and November 2011, seven SPHs across the Democratic Republic of Congo, Ethiopia, Kenya, Rwanda, Tanzania, and Uganda implemented this co-created tool. The objectives of the institutional assessments were to assess existing capacities for HSR and to develop capacity development plans to address prioritized gaps. A mixed-method approach was employed consisting of document analysis, self-assessment questionnaires, in-depth interviews, and institutional dialogues aimed at capturing individual perceptions of institutional leadership, collective HSR skills, knowledge translation, and faculty incentives to engage in HSR. Implementation strategies for the capacity assessment varied across the SPHs. This paper reports findings from semi-structured interviews with focal persons from each SPH, to reflect on the process used at each SPH to execute the institutional assessments as well as the perceived strengths and weaknesses of the assessment process. The assessment tool was robust enough to be utilized in its entirety across all seven SPHs resulting in a thorough HSR capacity assessment and a capacity development plan for each SPH. Successful implementation of the capacity assessment exercises depended on four factors: (i) support from senior leadership and collaborators, (ii) a common understanding of HSR, (iii) adequate human and financial resources for the exercise, and (iv) availability of data. Methods of extracting information from the results of the assessments, however, were tailored to the unique objectives of each SPH. This institutional HSR capacity assessment tool and the process for its utilization
Cloutier, Catherine; Locat, Jacques; Mayers, Mélanie; Noël, François; Turmel, Dominique; Jacob, Chantal; Dorval, Pierre; Bossé, François; Gionet, Pierre; Jaboyedoff, Michel
Rockfall is a significant hazard along linear infrastructures due to the presence of natural and man-made rock slopes. Knowing where the problematic rockfall source areas are is of primary importance to properly manage and mitigate the risk associated with rockfall along linear infrastructures. The aim of the ParaChute research project is to integrate various technologies into a workflow for rockfall characterization for such infrastructures, using as the study site a 220 km-long railroad located on Québec's North Shore, Canada. The objectives of this 3-year project, which started in 2014, are: (1) to optimize the use of terrestrial, mobile and airborne laser scanner data in terrain analysis, structural geology analysis and rockfall susceptibility rating; (2) to further develop the use of unmanned aerial vehicles (UAVs) for photogrammetry applied to rock cliff characterization; and (3) to integrate rockfall simulation studies into a rock slope classification system similar to the Rockfall Hazard Rating System. Firstly, based on laser scanner data and aerial photographs, the morpho-structural features of the terrain (genetic material, landform, drainage, etc.) are mapped. The result can be used to assess all types of mass movements. Secondly, to guide field work and decrease the uncertainty of various parameters, systematic rockfall simulations and a first structural analysis are made from point clouds acquired by mobile and airborne laser scanners. The simulation results are used to recognize the rock slopes that have potentially problematic rockfall paths, meaning they could reach the linear infrastructure. Other rock slopes are not included in the inventory. Field work is carried out to validate and complete the rock slope characterization previously made from remote sensing techniques. Because some cliffs, or parts of them, are not visible or accessible from the railroad, we are currently developing the use of photogrammetry by UAV in order to complete the
Thewissen, Liesbeth; Caicedo, Alexander; Lemmers, Petra; Van Bel, Frank; Van Huffel, Sabine; Naulaers, Gunnar
Introduction: Cerebral autoregulation (CAR), the ability of the human body to maintain cerebral blood flow (CBF) over a wide range of perfusion pressures, can be calculated by describing the relation between arterial blood pressure (ABP) and cerebral oxygen saturation measured by near-infrared spectroscopy (NIRS). In the literature, disturbed CAR is described in different patient groups, using multiple measurement techniques and mathematical models. Furthermore, it is unclear to what extent cerebral pathology and outcome can be explained by impaired CAR. Aim and methods: In order to summarize CAR studies using NIRS in neonates, a systematic review was performed in the PUBMED and EMBASE databases. To provide a general overview of the clinical framework used to study CAR, the different preprocessing methods and mathematical models are described and explained. Furthermore, patient characteristics, the definition of impaired CAR, and the outcome according to this definition are described, organized by patient group. Results: Forty-six articles were included in this review. Four patient groups were established: preterm infants during the transitional period, neonates receiving specific medication/treatment, neonates with congenital heart disease, and neonates with hypoxic-ischemic encephalopathy (HIE) treated with therapeutic hypothermia. Correlation, coherence and transfer function (TF) gain are the mathematical models most frequently used to describe CAR. The definition of impaired CAR depends on the mathematical model used. The incidence of intraventricular hemorrhage in preterm infants is the outcome variable most frequently correlated with impaired CAR. Hypotension, disease severity, dopamine treatment, injury on magnetic resonance imaging (MRI) and long-term outcome are associated with impaired CAR. Prospective interventional studies are lacking in all research areas. Discussion and conclusion: NIRS-derived CAR measurement is an important research tool to
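The correlation model mentioned above can be sketched as a moving Pearson correlation between ABP and NIRS-derived cerebral oxygen saturation: a coefficient near +1 suggests pressure-passive flow, i.e. impaired autoregulation in this model. The signals below are synthetic and the window length is an assumption, not a clinical recommendation.

```python
import numpy as np

def moving_correlation(abp, rso2, window=30):
    """Pearson r between ABP and rSO2 over consecutive sliding windows."""
    r = []
    for start in range(0, len(abp) - window + 1, window):
        a = abp[start:start + window]
        s = rso2[start:start + window]
        r.append(np.corrcoef(a, s)[0, 1])
    return np.array(r)

# Synthetic pressure-passive example: rSO2 closely tracks ABP.
rng = np.random.default_rng(0)
abp = 45 + 5 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 0.2, 300)
rso2 = 0.8 * abp + rng.normal(0, 0.2, 300)

scores = moving_correlation(abp, rso2)
print(scores.mean() > 0.8)  # strongly correlated -> impaired CAR in this model
```

Coherence and transfer-function gain, the other two models named in the abstract, would replace `np.corrcoef` with a cross-spectral estimate, but the windowed pairing of the two signals stays the same.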
Naiane Carvalho Wendt
The genogram is a graphical representation of a family and has been used in several contexts. This paper aims to highlight the relevance of using the genogram in qualitative research and to propose an application and analysis procedure. The information gathered by the genogram may include genetic, medical, social, behavioral, relational, and cultural aspects that denote the family structure and its configuration, giving indications of its functioning and dynamics. It is proposed that the inquiries, comments, and verbalizations made by the researcher during application of the tool be classified according to a predetermined category system, and that the data obtained be submitted to graphical, clinical, and discourse analysis for later calculation of inter-judge agreement.
This paper focuses on historical episodes related to medical research on yellow fever (1881–1903) and attempts to discuss (a) the influence that economic, social, and political factors exert on scientific research; (b) the collective, polemical, and nonlinear nature of the process of knowledge production in science; (c) the arbitrary character of scientific knowledge, in that it represents "a certain point of view" and is neither perennial nor elaborated exclusively on objective grounds; (d) the limited role played by experimental demonstrations, which are not "irrefutable"; and finally (e) the role played by paradigms, which lead not only to fruitful paths but also to dead ends. The intention is to provide suggestions that are useful both to science education researchers and to teachers.
Kademani, B. S.; Vijai Kumar, *
This paper highlights the information explosion, the need for bibliographic control, and the need for information retrieval tools. It explains the emergence of the citation index, the concept of citation indexing, reasons for citing, its structure (print and electronic versions of the Science Citation Index and Social Science Citation Index), and the application of citation indexes. It also discusses search effectiveness, the factors taken into consideration for coverage of journals in citation indexes, Journal Cita...
Trevino, Victor; Falciani, Francesco; Barrera-Saldaña, Hugo A
Among the many benefits of the Human Genome Project are new and powerful tools such as the genome-wide hybridization devices referred to as microarrays. Initially designed to measure gene transcriptional levels, microarray technologies are now used for comparing other genome features among individuals and their tissues and cells. Results provide valuable information on disease subcategories, disease prognosis, and treatment outcome. Likewise, they reveal differences in genetic makeup, regulat...
Since the end of the 19th century the Calabria region in southern Italy has been known for an abundance of grooved stone axes and hammers used during late prehistory. These artefacts are characterized by a wide and often pronounced groove in the middle of the implement, thought to have aided securing the head to a wooden haft. Their widespread presence is known both in prehistoric archaeological literature and in the archaeological collections of various regional and extra-regional museums. At first, scholars did not relate these tools to the rich Calabrian ore deposits and to possible ancient mining activities; they were regarded simply as a variant of ground lithic industry of Neolithic tradition. However, between 1997 and 2012, about 50 tools were discovered in the prehistoric mine of Grotta della Monaca in northern Calabria where there are outcrops of copper and iron ore. This allowed us to recognize their specific mining value and to consider them as a sort of “guide fossil” for the identification of ancient mining districts. This paper presents the results of a study involving over 150 tools from the entire region, effectively demonstrating an almost perfect co-occurrence of grooved axes and hammers with areas rich in mineral resources, especially metalliferous ores.
Schoop, Eric; Kriaučiūnienė, Roma; Brundzaitė, Rasa
This article analyses the needs and possibilities for developing new types of virtual collaboration skills in university students who are currently studying in the business and information systems area. We investigate the possibility of incorporating problem-based group learning and computer-supported tools into university curricula. The empirical research results are presented, which summarize experiences of using the virtual collaborative learning (VCL) environment provided by Business informat...
Wekerle, Christine; Vakili, Negar; Stewart, Sherry H; Black, Tara
Researchers in violence prevention areas seek to disseminate work for impact on practice and policy. Knowledge transfer, exchange, and mobilization are common terms for research knowledge utilization, where public communication platforms are playing an increasing role, having a unique capacity to connect stakeholders in advocacy and lived experience, academia, non-governmental organizations, government-supported organizations such as child welfare, and research funding bodies. Social networking platforms provide a communication intervention opportunity to test the effectiveness of research reach. A Canadian Institutes of Health Research-funded team grant in boys' and men's health, focusing on sexual violence (SV) victimization, health, and resilience, undertook an evaluation to examine whether a strategic approach involving a cadre of SV experts (n = 46) and their research increased engagement. Using a unique identifier (#CIHRTeamSV), content was shared on social media (Twitter) within an ABABAB experimental monthly format (A = no sharing; B = sharing content), following a baseline entry of researchers. Active Twitter engagement led to increases in the number of individuals' profile views, article downloads, and citations. These findings encourage further research into the utility of social media for disseminating sexual violence research, and suggest that social media has developed as a forum for evidence-based conversation on sensitive topics of public health import. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
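At its core, the ABABAB design described above contrasts engagement in sharing months (B) against non-sharing months (A). A minimal sketch with hypothetical monthly counts (the study's actual metrics and values are not reproduced here):

```python
# Hypothetical monthly engagement counts (e.g., profile views) across an
# ABABAB schedule: A = no sharing, B = strategic sharing of tagged content.
schedule = ["A", "B", "A", "B", "A", "B"]
engagement = [120, 310, 135, 290, 110, 340]  # illustrative numbers only

a_months = [e for p, e in zip(schedule, engagement) if p == "A"]
b_months = [e for p, e in zip(schedule, engagement) if p == "B"]
effect = sum(b_months) / len(b_months) - sum(a_months) / len(a_months)
print(f"mean A = {sum(a_months)/len(a_months):.1f}, "
      f"mean B = {sum(b_months)/len(b_months):.1f}, lift = {effect:.1f}")
```

The alternation lets each team serve as its own control, so a consistent A-to-B lift is harder to attribute to a secular trend than a simple before/after comparison would be.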
Advanced light sources peer into matter at the atomic and molecular scales, with applications ranging from physics, chemistry, materials science, and advanced energy research, to biology and medicine.
Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente
To introduce a program for the management of scientific research in a general hospital employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, with subsequent analysis, based on PMBOK practices, of the solutions found. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and constitute a model that may contribute to the search for innovative science management solutions by other institutions engaged in scientific research in Brazil.
Simeonov, Valentin; Dinoev, Todor; Serikov, Ilya; Froidevaux, Martin; Bartlome, Marcel; Calpini, Bertrand; Bobrovnikov, Sergei; Ristori, Pablo; van den Bergh, Hubert; Parlange, Marc; Archinov, Yury
The talk will present the concept and observation results of three advanced lidar systems developed recently at the Swiss Federal Institute of Technology Lausanne (EPFL), Switzerland. Two of the systems are Raman lidars for simultaneous water vapor, temperature and aerosol observations, and the third is an ozone UV DIAL system. The Raman lidars use vibrational water vapor and nitrogen signals to derive water vapor mixing ratio and temperature; aerosol extinction and backscatter are measured using pure-rotational Raman and elastic signals. The first Raman lidar (RALMO) is a fully automated water vapor/temperature/aerosol lidar developed for operational use by the Swiss meteorological office (MeteoSwiss). The lidar supplies water vapor mixing ratio and temperature, plus aerosol extinction and backscatter coefficients at 355 nm. The operational range of the lidar is 100-7000 m (nighttime) and 100-5000 m (daytime), with a time resolution of 30 min. The spatial resolution varies with height from 25 to 300 m in order to maintain a maximum measurement error of 10%. The system is designed to provide a long-term database with minimal instrument-induced variations in time of the measured parameters. The lidar has been in regular operation at the main aerological station of MeteoSwiss in Payerne since September 2008. The second Raman lidar is a new-generation, solar-blind system with an operational range of 10-500 m and high spatial (1.5 m) and temporal (1 s) resolutions, designed for simultaneous humidity, temperature, and aerosol measurements in the lower atmosphere. To maintain the measurement accuracy while operating with fixed spatial and temporal resolution, the receiver is designed to keep the dynamic range of the signals below ten within the distance range of the lidar. The lidar has 360° azimuth and 240° elevation scanning ability. The lidar was used in two field campaigns aiming to study the structure of the lower atmosphere over complex terrains and, in particular
complete the Master with a seminar: Nuclear Power Plants, and a Thesis. In the frame of the academic plan, multiple activities are organized related to research reactors and also to nuclear power plants. From the very beginning, the performance of selected experiments in a nuclear reactor was recognized as an extraordinary tool to give the students insight into the principal phenomena associated with the chain reaction and the related engineering problems. These experiments have an intrinsically elevated cost, associated with the relevance of the installation and with the specialized personnel involved. CNEA provides the career with this educational instrument through the RA-1 and RA-3 reactors, located at the Constituyentes and Ezeiza Atomic Centers respectively. Various activities are under way, but the most established, in the Reactor Physics Course, is the estimation of kinetic parameters in the RA-1 reactor. The practice includes three different experiments: Approach to critical and calibration of control rods by the compensation method: starting in a subcritical state with a source, the calibration of control rod B1 vs. B2 is done by insertion of the first and withdrawal of the second; the methods used are based on the Point Kinetic Model. Measurement of control rod worth by the rod-drop method: separate drops of rods B1, B2 and B3 from the overall ensemble B1 B2 B3 B4, and a total scram starting with three rods withdrawn and one partially inserted, are the procedures followed to estimate the reactivity worth of B1, B2, B3 and of the scram; the Point Kinetic Model and the Modal Kinetic Model are used. Reactor noise technique for the estimation of the reactor parameters α and Λ. 
The kinetic parameters are estimated while ensuring that the Point Kinetic Model is valid (detection chambers near the core), that the fluctuation of the fission density is the dominant source of the correlated part of the neutron noise (measurement at low power, <10 kW), and the dominance of the fundamental harmonic (simultaneous use of
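As a hedged illustration of the rod-drop analysis mentioned above: within the one-delayed-group Point Kinetic Model, the prompt-jump approximation relates the prompt flux drop to the inserted reactivity in dollars. The flux levels below are illustrative, not measured RA-1 data:

```python
def rod_drop_reactivity_dollars(n0, n1):
    """Prompt-jump approximation of the one-delayed-group Point Kinetic
    Model: after a rod drop the flux falls promptly from n0 to n1, and
    n1/n0 = beta/(beta - rho), so rho/beta = 1 - n0/n1 (negative for a
    drop, i.e. for n1 < n0)."""
    return 1.0 - n0 / n1

# Illustrative: flux drops promptly to 40% of its initial level.
print(rod_drop_reactivity_dollars(1.0, 0.4))  # -1.5 dollars
```

A full analysis such as the one in the course also uses the slower delayed-neutron decay after the prompt jump, but the prompt-jump estimate above is the standard first step.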
McCormick, Tyler H.; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S.
Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users--a key component of much social science research--remains a…
Sools, Anna Maria
In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that
Carver, Cynthia L.; Klein, C. Suzanne
This paper introduces the use of action research to examine the content and outcomes of university-based leadership preparation programs. Using examples drawn from an ongoing action research project with candidates in a master's level principal preparation program, we demonstrate how the collection and analysis of candidate's written reflections,…
Silva, Alcino J.; Müller, Klaus-Robert
The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other…
NASA's strategic goals include advancing knowledge and opportunity in space and improving life on Earth. We support these goals through extensive programs in space and Earth science research accomplished via space-based missions and research funding. NASA's "system" is configured to conduct science using (1) in-house personnel and (2) grants, contracts, and agreements with external entities (academia, industry, international space agencies).
Many undergraduate laboratories are, too often, little more than an exercise in “cooking” where students are instructed step-by-step what to add, mix, and, most unfortunately, expect as an outcome. Although the shortcomings of “cookbook” laboratories are well known, they are considerably easier to manage than the more desirable inquiry-based laboratories. Thus the ability to quickly access, share, sort, and analyze research data would make a significant contribution towards the feasibility of teaching/mentoring large numbers of inexperienced students in an inquiry-based research environment, as well as facilitating research collaborations among students. Herein we report on a software tool (MicroTracker) designed to address the educational problems that we experienced with inquiry-based research education due to constraints on data management and accessibility.
Ogao, Patrick J
Background: Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed...
Qualitative evaluation of the implementation of the Interdisciplinary Management Tool: a reflective tool to enhance interdisciplinary teamwork using Structured, Facilitated Action Research for Implementation.
Nancarrow, Susan A; Smith, Tony; Ariss, Steven; Enderby, Pamela M
Reflective practice is used increasingly to enhance team functioning and service effectiveness; however, there is little evidence of its use in interdisciplinary teams. This paper presents the qualitative evaluation of the Interdisciplinary Management Tool (IMT), an evidence-based change tool designed to enhance interdisciplinary teamwork through structured team reflection. The IMT incorporates three components: an evidence-based resource guide; a reflective implementation framework based on Structured, Facilitated Action Research for Implementation methodology; and formative and summative evaluation components. The IMT was implemented with intermediate care teams supported by independent facilitators in England. Each intervention lasted 6 months and was evaluated over a 12-month period. Data sources include interviews, a focus group with facilitators, questionnaires completed by team members and documentary feedback from structured team reports. Data were analysed qualitatively using the Framework approach. The IMT was implemented with 10 teams, including 253 staff from more than 10 different disciplines. Team challenges included lack of clear vision; communication issues; limited career progression opportunities; inefficient resource use; need for role clarity and service development. The IMT successfully engaged staff in the change process, and resulted in teams developing creative strategies to address the issues identified. Participants valued dedicated time to focus on the processes of team functioning; however, some were uncomfortable with a focus on teamwork at the expense of delivering direct patient care. The IMT is a relatively low-cost, structured, reflective way to enhance team function. It empowers individuals to understand and value their own, and others' roles and responsibilities within the team; identify barriers to effective teamwork, and develop and implement appropriate solutions to these. To be successful, teams need protected time to take
Atkinson, Nancy L; Massett, Holly A; Mylks, Christy; McCormack, Lauren A; Kish-Doto, Julia; Hesse, Bradford W; Wang, Min Qi
Informatics applications have the potential to improve participation in clinical trials, but their design must be based on user-centered research. This research used a fully counterbalanced experimental design to investigate the effect of changes made to the original version of a website, http://BreastCancerTrials.org/, and confirm that the revised version addressed and reinforced patients' needs and expectations. Participants included women who had received a breast cancer diagnosis within the last 5 years (N=77). They were randomized into two groups: one group used and reviewed the original version first followed by the redesigned version, and the other group used and reviewed them in reverse order. The study used both quantitative and qualitative measures. During use, participants' click paths and general reactions were observed. After use, participants were asked to answer survey items and open-ended questions to indicate their reactions, which version they preferred, and which better met their needs and expectations. Overall, the revised version of the site was preferred and perceived to be clearer, easier to navigate, more trustworthy and credible, and more private and safe. However, users who viewed the original version last had similar attitudes toward both versions. By applying research findings to the redesign of a website for clinical trial searching, it was possible to re-engineer the interface to better support patients' decisions to participate in clinical trials. The mechanisms of action in this case appeared to revolve around creating an environment that supported a sense of personal control and decisional autonomy.
Reuter, Katja; Ukpolo, Francis; Ward, Edward; Wilson, Melissa L; Angyan, Praveen
Background: Understanding the relationship between organizational context and research utilization is key to reducing the research-practice gap in health care. This is particularly true in the residential long term care (LTC) setting where relatively little work has examined the influence of context on research implementation. Reliable, valid measures and tools are a prerequisite for studying organizational context and research utilization. Few such tools exist in German. We thus translated three such tools (the Alberta Context Tool and two measures of research use) into German for use in German residential LTC. We point out challenges and strategies for their solution unique to German residential LTC, and demonstrate how resolving specific challenges in the translation of the health care aide instrument version streamlined the translation process of versions for registered nurses, allied health providers, practice specialists, and managers. Methods: Our translation methods were based on best practices and included two independent forward translations, reconciliation of the forward translations, expert panel discussions, two independent back translations, reconciliation of the back translations, back translation review, and cognitive debriefing. Results: We categorized the challenges in this translation process into seven categories: (1) differing professional education of Canadian and German care providers, (2) risk that German translations would become grammatically complex, (3) wordings at risk of being misunderstood, (4) phrases/idioms non-existent in German, (5) lack of corresponding German words, (6) limited comprehensibility of corresponding German words, and (7) target persons’ unfamiliarity with activities detailed in survey items. Examples of each challenge are described with strategies that we used to manage the challenge. Conclusion: Translating an existing instrument is complex and time-consuming, but a rigorous approach is necessary to obtain instrument
Xu Rattanasone, Nan; Davies, Benjamin; Schembri, Tamara; Andronos, Fabia; Demuth, Katherine
Learning about what young children with limited spoken language know about the grammar of their language is extremely challenging. Researchers have traditionally used looking behavior as a measure of language processing and to infer what overt choices children might make. However, these methods are expensive to set up, require specialized training, are time-intensive for data analysis, and can have considerable dropout rates. For these reasons, we have developed a forced-choice task delivered on an iPad, based on our eye-tracking studies with English monolinguals (Davies et al., 2016, under review). Using the iPad, we investigated 3- and 4-year-olds' understanding of the English plural in preschool centers. The primary aim of the study was to provide evidence for the usefulness of the iPad as a language research tool. We evaluated the usefulness of the iPad with second language (L2) learning children who have limited L2 language skills. Studies with school-aged Chinese-speaking children show below-native performance on English inflectional morphology despite 5-6 years of immersion (Jia, 2003; Jia and Fuse, 2007; Paradis et al., 2016). However, it is unclear whether this is specific only to children who speak Chinese as their first language (L1) or if younger preschoolers will also show similar challenges. We tested three groups of preschoolers with different L1s (English, Chinese, and other languages). L1 Chinese children's performance was below both English monolinguals and children speaking other L1 languages, providing evidence that English inflections are specifically challenging for Chinese-speaking children. The results provide further evidence to support previous eye-tracking findings with monolinguals and studies with older bilinguals. The study provides evidence for the usefulness of iPads as a research tool for studying language acquisition. Implications for future application of the iPad as a teaching and intervention tool, and limitations for the method, are
Brevik, Eric C.; Lindbo, David L.; Belcher, Christopher
Several studies crossing numerous disciplinary boundaries have demonstrated that undergraduate students benefit from research experiences. These benefits include personal and intellectual development, more and closer contact with faculty, the use of active learning techniques, the creation of high expectations, the development of creative and problem-solving skills, and the development of greater independence and intrinsic motivation to learn. The discipline also gains in that studies show undergraduates who engage in research experiences are more likely to remain science majors and finish their degree program. Research experiences come as close as possible to allowing undergraduates to experience what it is like to be an academic or research member of their profession working to advance their discipline, therefore enhancing their professional socialization into their chosen field. If the goals achieved by undergraduate research include introducing these students to the advancement of their chosen field, it stands to reason the ultimate ending to this experience would be the publication of a peer-reviewed paper. While not all undergraduate projects will end with a product worthy of peer-reviewed publication, some definitely do, and the personal experience of the authors indicates that undergraduate students who achieve publication get great satisfaction and a sense of personal achievement from that publication. While a top-tier international journal probably isn't going to be the ultimate destination for many of these projects, there are several appropriate outlets. The SSSA journal Soil Horizons has published several undergraduate projects in recent years, and good undergraduate projects can often be published in state academy of science journals. Journals focused expressly on publishing undergraduate research include the Journal of Undergraduate Research and Scholarly Excellence, Reinvention, and the American Journal of Undergraduate Research. Case studies of
Hollins Martin, Caroline J; Forrest, Eleanor; Wylie, Linda; Martin, Colin R
The NMSF (2009) survey reported that bereavement midwife care was inadequate in a number of UK NHS Trusts. Using a small grant from the Scottish government, 3 experienced midwifery lecturers designed an interactive workbook called "Shaping bereavement care for midwives in clinical practice" for the purpose of improving delivery of bereavement education to student midwives. An instrument called the Understanding Bereavement Evaluation Tool (UBET) was designed to measure the effectiveness of the workbook at equipping students with essential knowledge. To assess validity and reliability of the UBET at measuring midwives' self-perceptions of knowledge surrounding delivery of bereavement care to childbearing women, partners and families who have experienced childbirth-related bereavement. An evaluative audit using the UBET was undertaken to explore student midwives' (n=179) self-perceived knowledge levels before and after the workbook intervention. Validity tests have shown that the UBET (6-item version) could be considered a psychometrically robust instrument for assessing students' knowledge gain. PCA identified that the UBET comprised two sub-scales (theoretical knowledge base - Q 1, 2 & 3; and psychosocial elements of care delivery - Q 4, 5 & 6). Data have shown that the easy-to-administer, short 6-item UBET is a valid and reliable tool for educators to measure success in delivering education using the "Shaping bereavement care for midwives in clinical practice" workbook. Copyright © 2012 Elsevier Ltd. All rights reserved.
Maprelian, Eduardo; Cabral, Eduardo L.L.; Silva, Antonio T. e
Loss of coolant accidents (LOCA) in pool-type research reactors are normally considered limiting in the licensing process. This paper verifies the viability of the computer code 3D-AIRLOCA for analyzing LOCA in a pool-type research reactor, and also develops two computer codes, LOSS and TEMPLOCA. The computer code LOSS determines the time to drain the pool down to the level of the bottom of the core, and the computer code TEMPLOCA calculates the peak fuel element temperature during the transient. These two codes substitute for 3D-AIRLOCA in LOCA analysis for pool-type research reactors. (author)
Nikolian, Vahagn C; Ibrahim, Andrew M
Journals fill several important roles within academic medicine, including building knowledge, validating the quality of methods, and communicating research. This section provides an overview of these roles and highlights innovative approaches journals have taken to enhance dissemination of research. As journals move away from print formats and embrace web-based content, design-centered thinking will allow for engagement of a larger audience. Examples of recent efforts in this realm are provided, as well as simplified strategies for developing visual abstracts to improve dissemination via social media. Finally, we home in on principles of learning and education which have driven these advances in multimedia-based communication in scientific research.
Thiel, William H; Giangrande, Paloma H
The development of DNA and RNA aptamers for research as well as diagnostic and therapeutic applications is a rapidly growing field. In the past decade, the process of identifying aptamers has been revolutionized with the advent of high-throughput sequencing (HTS). However, bioinformatics tools that enable the average molecular biologist to analyze these large datasets and expedite the identification of candidate aptamer sequences have been lagging behind the HTS revolution. The Galaxy Project was developed in order to efficiently analyze genome, exome, and transcriptome HTS data, and we have now applied these tools to aptamer HTS data. The Galaxy Project's public webserver is an open source collection of bioinformatics tools that are powerful, flexible, dynamic, and user friendly. The online nature of the Galaxy webserver and its graphical interface allow users to analyze HTS data without compiling code or installing multiple programs. Herein we describe how tools within the Galaxy webserver can be adapted to pre-process, compile, filter and analyze aptamer HTS data from multiple rounds of selection. Copyright © 2015 Elsevier Inc. All rights reserved.
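The round-to-round filtering described above can be sketched outside Galaxy as well. This is not a Galaxy workflow; it is a minimal stand-in showing the underlying idea of ranking candidate sequences by fold-enrichment between selection rounds. The toy reads, the minimum-count cutoff, and the pseudocount are all illustrative assumptions:

```python
from collections import Counter

def enrichment(round_early, round_late, min_count=5):
    """Rank candidate aptamer sequences by fold-enrichment in read
    frequency between an early and a late selection round."""
    early, late = Counter(round_early), Counter(round_late)
    n_early, n_late = sum(early.values()), sum(late.values())
    ranked = {}
    for seq, c in late.items():
        if c < min_count:
            continue  # drop low-abundance sequences (likely noise)
        freq_late = c / n_late
        freq_early = (early.get(seq, 0) + 1) / (n_early + 1)  # pseudocount
        ranked[seq] = freq_late / freq_early
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)

# Toy reads: "AAA" enriches across rounds, "CCC" does not.
r2 = ["AAA"] * 2 + ["CCC"] * 8
r8 = ["AAA"] * 7 + ["CCC"] * 3
top_seq, fold = enrichment(r2, r8, min_count=1)[0]
print(top_seq, round(fold, 2))
```

Real HTS datasets contain millions of reads and need the pre-processing (trimming, quality filtering, compiling across rounds) that the Galaxy tools provide, but the enrichment ranking at the end has this shape.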
National Institutes of Health, Department of Health and Human Services — Research projects funded by the National Institutes of Health (NIH), other DHHS Operating Divisions (AHRQ, CDC, FDA, HRSA, SAMHSA), and the Department of Veterans...
This brochure highlights selected aspects of the NASA Microgravity Science and Applications program. So that we can expand our understanding and control of physical processes, this program supports basic and applied research in electronic materials, metals, glasses and ceramics, biological materials, combustion, and fluids and chemicals. NASA facilities that provide weightless environments on the ground, in the air, and in space are available to U.S. and foreign investigators representing the academic and industrial communities. After a brief history of microgravity research, the text explains the advantages and methods of performing microgravity research. Illustrations follow of equipment used and experiments performed aboard the Shuttle, and of prospects for future research. The brochure concludes by describing the program goals and the opportunities for participation.
Miyungi Odhiambo, Christine Adhiambo
Social media is a phenomenon that has become an important aspect of the marketing mix and is revolutionizing the way companies interact with customers. It is a new research field, and a quick literature scan reveals that not many studies exist. Nevertheless, these few existing studies, lacking scientific evidence from industry data, have rushed to conclude that the emergence of social media has led to the demise of the traditional advertising mainstream media. Therefore, using a scientific research me...
Miguel Cruz Ramírez
Full Text Available In this paper we report a research study geared toward refining an empirical instrument for the selection of experts for educational research, assessing its reliability and internal consistency. To this end we used a three-round Delphi technique and subjected the results to a factor analysis. Latent variables were determined that explain the nature of the sources of argumentation necessary for ensuring an adequate level of competence on the part of the experts.
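As a concrete illustration of the internal-consistency check mentioned above, Cronbach's alpha can be computed directly from an experts-by-items rating matrix. The ratings below are invented for the example and do not reproduce the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)),
    where `scores` is an (experts x items) array of ratings."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # per-item sample variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of each expert's total
    return k / (k - 1) * (1 - item_var / total_var)

# Four hypothetical experts rating three instrument items on a 1-5 scale.
ratings = [[4, 5, 4],
           [3, 4, 3],
           [5, 5, 5],
           [2, 3, 2]]
alpha = cronbach_alpha(ratings)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the kind of evidence the refinement study relies on.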
Bates, Matthew E.; Keisler, Jeffrey M.; Zussblatt, Niels P.; Plourde, Kenton J.; Wender, Ben A.; Linkov, Igor
Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis—methods commonly applied in financial and operations management—to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios—combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
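A minimal sketch of the Monte Carlo comparison of research strategies might look as follows. The factor names, score ranges, and classification threshold are illustrative stand-ins, not the actual CB Nanotool parameters; "funding an experiment" is modeled simply as collapsing one uncertain input to its midpoint.

```python
import random

random.seed(0)

# Hypothetical uncertain (low, high) score ranges per hazard input.
FACTORS = {
    "diameter":   (1, 4),
    "solubility": (0, 3),
    "reactivity": (0, 4),
    "shape":      (1, 3),
}

def sample_hazard(resolved=()):
    """One Monte Carlo draw of the total hazard score. Inputs listed in
    `resolved` are treated as known (midpoint), simulating the effect of
    funding an experiment that measures them."""
    total = 0.0
    for name, (lo, hi) in FACTORS.items():
        total += (lo + hi) / 2 if name in resolved else random.uniform(lo, hi)
    return total

def misclassification_rate(resolved=(), threshold=7.0, n=10_000):
    """Fraction of draws whose hazard class (above/below threshold)
    disagrees with the all-midpoint baseline classification."""
    baseline = sample_hazard(resolved=FACTORS) >= threshold
    flips = sum((sample_hazard(resolved) >= threshold) != baseline
                for _ in range(n))
    return flips / n

r_none = misclassification_rate()                       # no new research
r_reac = misclassification_rate(resolved=("reactivity",))  # fund one experiment
```

Comparing the drop in misclassification against the elicited cost of each experiment is what lets efficient research portfolios be identified.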
Rider, Lisa G; Dankó, Katalin; Miller, Frederick W
Clinical registries and biorepositories have proven extremely useful in many studies of diseases, especially rare diseases. Given their rarity and diversity, the idiopathic inflammatory myopathies, or myositis syndromes, have benefited from individual researchers' collections of cohorts of patients. Major efforts are being made to establish large registries and biorepositories that will allow many additional studies to be performed that were not possible before. Here, we describe the registries developed by investigators and patient support groups that are currently available for collaborative research purposes. We have identified 46 myositis research registries, including many with biorepositories, which have been developed for a wide variety of purposes and have resulted in great advances in understanding the range of phenotypes, clinical presentations, risk factors, pathogenic mechanisms, outcome assessment, therapeutic responses, and prognoses. These are now available for collaborative use to undertake additional studies. Two myositis patient registries have been developed for research, and myositis patient support groups maintain demographic registries with large numbers of patients available to be contacted for potential research participation. Investigator-initiated myositis research registries and biorepositories have proven extremely useful in understanding many aspects of these rare and diverse autoimmune diseases. These registries and biorepositories, in addition to those developed by myositis patient support groups, deserve continued support to maintain the momentum in this field as they offer major opportunities to improve understanding of the pathogenesis and treatment of these diseases in cost-effective ways.
Henderson, Vida A; Barr, Kathryn L; An, Lawrence C; Guajardo, Claudia; Newhouse, William; Mase, Rebecca; Heisler, Michele
Together, community-based participatory research (CBPR), user-centered design (UCD), and health information technology (HIT) offer promising approaches to improve health disparities in low-resource settings. This article describes the application of CBPR and UCD principles to the development of iDecide/Decido, an interactive, tailored, web-based diabetes medication education and decision support tool delivered by community health workers (CHWs) to African American and Latino participants with diabetes in Southwest and Eastside Detroit. The decision aid is offered in English or Spanish and is delivered on an iPad in participants' homes. The overlapping principles of CBPR and UCD used to develop iDecide/Decido include a user-focused or community approach, equitable academic and community partnership in all study phases, an iterative development process that relies on input from all stakeholders, and a program experience that is specified, adapted, and implemented with the target community. Collaboration between community members, researchers, and developers is especially evident in the program's design concept, animations, pictographs, issue cards, goal setting, tailoring, and additional CHW tools. The principles of CBPR and UCD can be successfully applied in developing health information tools that are easy to use and understand, interactive, and target health disparities.
Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo
Allocating limited resources efficiently is a task to which efficient planning and policy design aspire. This may be a non-trivial task. For example, the seventh sustainable development goal (SDG) of Agenda 2030 is to provide access to affordable, sustainable energy to all. On the one hand, energy is required to realise almost all other SDGs (a clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.). On the other hand, the energy system is non-trivial: it requires the mapping of resources, their conversion into useable energy, and then into the machines that we use to meet our needs. That requires new tools that draw from standard techniques and best-in-class models and allow the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, run, and store input and results data for linear programming models. MoManI is a browser-based, open-source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model represents the current energy system of a country, region, or city; in it, equations and constraints are specified and calibrated to a base year. From that, future technologies and policy options are represented; from those, scenarios are designed and run. Efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity-building activities. The novel
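The core allocation problem such a linear program solves can be illustrated, for a single period, with a merit-order dispatch in plain Python: meet demand from the cheapest technology first. The technologies, capacities, and costs below are invented; a real OSeMOSYS model adds many periods, constraints, and investment decisions on top of this.

```python
def dispatch(demand, technologies):
    """Least-cost dispatch for a single period: fill demand from the
    cheapest technology first, up to each one's capacity. This is the
    solution a one-period linear program would find."""
    plan, remaining = {}, demand
    for name, cap, cost in sorted(technologies, key=lambda t: t[2]):
        use = min(cap, remaining)
        plan[name] = use
        remaining -= use
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan

# Illustrative technologies: (name, capacity in MW, cost in $/MWh)
techs = [("coal", 60, 40.0), ("gas", 50, 70.0), ("hydro", 30, 5.0)]
plan = dispatch(100, techs)
total_cost = sum(use * cost for (name, cap, cost) in techs
                 for n2, use in plan.items() if n2 == name)
```

Varying the capacities and costs across scenarios, and charting the resulting plans, is the kind of workflow MoManI wraps in a browser interface.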
Full text: The construction of the Nuclear Research Center of Maamora (NRCM) will enable the National Center for Nuclear Energy, Sciences and Techniques (CNESTEN) to fulfill its missions of promoting nuclear techniques in socioeconomic fields, acting as technical support for the authorities, and contributing to the introduction of nuclear power for electricity generation, considered in the new energy strategy as an alternative option for the period 2020-2030. CNESTEN commissioned its 2000 kW Triga Mark II nuclear research reactor in 2007, for which the operating authorization was delivered in 2009. This research reactor is the keystone structure of the NRCM; its existing and planned utilization includes: production of radioisotopes for medical use, neutron activation analysis, non-destructive examination techniques, neutron scattering, reactor physics research, and training. In terms of human resources development, CNESTEN is focusing on education and training, for which an international training center is under development. The TRIGA research reactor will be an important component of this center. In order to promote the utilization of the research reactor in socio-economic sectors at the national level, CNESTEN organizes meetings, schools, and conferences around each of the reactor applications, and offers researchers, students, and socio-economic operators the opportunity to learn more about reactor utilization through scientific visits, courses, and training programs. At the international level, CNESTEN strengthens its international partnerships. The regional and international cooperation with the IAEA, AFRA, and bilateral partners (USA, France) constitutes the platform for capacity building in the different areas of CNESTEN's TRIGA research reactor utilization
Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han
In order to determine the key components of CNC machine tools under fault-rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relations are then organized hierarchically using an interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rate under time correlation, a comprehensive fault rate can be obtained. Based on the fault mode frequency and fault influence, the criticality of the components under fault-rate correlation is determined, and the key components are identified, providing a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
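The relative-influence computation can be sketched with a plain power-iteration PageRank over a fault-propagation adjacency matrix. The four-component subsystem below is hypothetical and far simpler than a real machining-center fault model; the sketch also omits the Markov-transformation and time-correlation steps of the full method.

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank over a fault-propagation adjacency
    matrix: adj[i][j] = 1 if a fault in component i can induce a
    fault in component j. Returns relative influence values."""
    n = len(adj)
    rank = [1.0 / n] * n
    out = [sum(row) or 1 for row in adj]  # avoid division by zero for sinks
    for _ in range(iters):
        new = []
        for j in range(n):
            inflow = sum(rank[i] * adj[i][j] / out[i] for i in range(n))
            new.append((1 - damping) / n + damping * inflow)
        rank = new
    return rank

# Hypothetical subsystem: spindle(0) faults can induce faults in the
# tool holder(1) and feed axis(2); feed-axis faults load the controller(3).
adj = [[0, 1, 1, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
scores = pagerank(adj)
```

Components at the end of long propagation chains accumulate influence, which is why the controller scores highest in this toy graph.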
Wirta, H.; Várkonyi, G.; Rasmussen, C.
DNA sequences offer powerful tools for describing the members and interactions of natural communities. In this study, we establish the to-date most comprehensive library of DNA barcodes for a terrestrial site, including all known macroscopic animals and vascular plants of an intensively studied … area of the High Arctic, the Zackenberg Valley in Northeast Greenland. To demonstrate its utility, we apply the library to identify nearly 20 000 arthropod individuals from two Malaise traps, each operated for two summers. Drawing on this material, we estimate the coverage of previous morphology … ongoing shifts in arctic communities and ecosystems. The DNA barcode library now established for Zackenberg offers new scope for such explorations, and for the detailed dissection of interspecific interactions throughout the community.
Keyser, Matthew; Darcy, Eric; Pesaran, Ahmad
Li-ion cells provide the highest specific energy and energy density of any rechargeable battery, with the longest life. Many safety incidents that take place in the field originate from an internal short that was not detectable or predictable at the point of manufacture. NREL's internal short circuit (ISC) device is capable of simulating shorts and produces consistent and reproducible results. The cell behaves normally until the ISC device is activated, whereupon a latent defect (i.e., built into the cell during manufacturing) gradually moves into position to create an internal short while the battery is in use, providing relevant data to verify abuse models. The ISC device is an effective tool for studying the safety features of parts of Li-ion batteries.
Cándida Gago García
Full Text Available This paper contains a theoretical reflection on the methodology and meaning of global city rankings. There is a very large academic production on the role that some cities play in global territorial processes, which has been related to the concept of the global city. Many recent contributions from the mass media, advertising, and consulting services must also be considered in the analysis. All of them have included new indicators in order to show the main role that cultural services have acquired in the urban economy. City rankings are also being used as a tool in neoliberal policies: these policies stress the position that cities hold in the rankings, which are used in city-branding practices and to justify the neoliberal decisions being taken. In fact, we think that rankings are used inappropriately and that a deep, fresh reflection on them is necessary.
Binas, B. [Max Delbrueck Center for Molecular Medicine, Berlin-Buch (Germany)
Fatty acid-binding proteins (FABPs) are major targets for specific binding of fatty acids in vivo. They constitute a widely expressed family of genetically related, small cytosolic proteins which very likely mediate intracellular transport of free long-chain fatty acids. Genetic inhibition of FABP expression in vivo should therefore provide a useful tool to investigate and engineer fatty acid metabolism. (orig.)
Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.
Hardware, software, and techniques used by the NASA Fluid Dynamics Division for visualization of computational aerodynamics, which can be applied to visualizing flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques (post-processing, tracking, and steering) are described, as well as the post-processing software packages used: PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis Software Toolkit). Using post-processing methods, a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that a high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs