Li, Linda C; Adam, Paul; Townsend, Anne F; Stacey, Dawn; Lacaille, Diane; Cox, Susan; McGowan, Jessie; Tugwell, Peter; Sinclair, Gerri; Ho, Kendall; Backman, Catherine L
People with rheumatoid arthritis (RA) should use DMARDs (disease-modifying anti-rheumatic drugs) within the first three months of symptoms in order to prevent irreversible joint damage. However, recent studies report the delay in DMARD use ranges from 6.5 months to 11.5 months in Canada. While most health service delivery interventions are designed to improve the family physician's ability to refer to a rheumatologist and prescribe treatments, relatively little has been done to improve the delivery of credible, relevant, and user-friendly information for individuals to make treatment decisions. To address this care gap, the Animated, Self-serve, Web-based Research Tool (ANSWER) will be developed and evaluated to assist people in making decisions about the use of methotrexate, a type of DMARD. The objectives of this project are: 1) to develop ANSWER for people with early RA; and 2) to assess the extent to which ANSWER reduces people's decisional conflict about the use of methotrexate, improves their knowledge about RA, and improves their skills of being 'effective healthcare consumers'. Consistent with the International Patient Decision Aid Standards, the development process of ANSWER will involve: 1) creating a storyline and scripts based on the best evidence on the use of methotrexate and other management options in RA, and the contextual factors that affect a patient's decision to use a treatment as found in ERAHSE; 2) using an interactive design methodology to create, test, analyze and refine the ANSWER prototype; 3) testing the content and user interface with health professionals and patients; and 4) conducting a pilot study with 51 patients diagnosed with RA in the past 12 months, to assess the extent to which ANSWER improves the quality of their decisions, knowledge and skills in being effective consumers. We envision that ANSWER will help accelerate the dissemination of knowledge and skills necessary for people with early RA to make informed treatment decisions.
Booka, Masayuki; Oku, Hidehisa; Scheller, Andreas; Yamaoka, Shintaro
The research results on the usability of the Optical Mark Reader Sheet (OMRS) used as the standard answering tool are reported. Answering with the OMRS requires significantly more time than answering without it, and the use of assistive devices for the OMRS has the potential to reduce the answer time.
Barker, James; Pope, Deborah
The "working scientifically" strand of the new primary science curriculum for England has re-emphasised the importance of children having opportunities to carry out different types of enquiries to answer their scientific questions. To promote this as an ongoing aim of primary science education, it is equally important for trainee primary…
Carvalheiro, Luisa G.
Pollination Ecology is a dynamic field of scientific research constantly adopting novel methods and making progress in understanding the interactions between plants and their pollinators. A recent paper listed the main scientific questions in this field focussing on the ecological and biological system itself. Here, we follow up on that paper and present some ideas on how to broaden our perspective and explore the role that pollination research can play in answering both ecological and societal questions relevant to a range of different stakeholders. We hope this paper may be useful to researchers aiming at improving both the scientific and societal impact of their research.
A National Taiwan University (NTU) research team has discovered a new physical phenomenon that could challenge the well-accepted theory about the birth of the universe. However, Lee Shih-chang, a researcher at Academia Sinica's Institute of Physics, said the new scientific research results will be accepted only after an academic paper detailing the research process and conclusions is released and the research results are verified by experts in the field.
Oct 8, 2010 ... the environment, and information technology hopes the new research program that ... Alper observes that the Canada Research Chairs program's success in achieving this goal provides one ... Like the Canada Research Chairs program, the IRCI emphasizes training students to ...
Male students consequently perceive their female counterparts to have an unfair advantage ... She presented her research on the connections between gender violence and ... male, and choices of discipline seem to follow old stereotypes.
van den Heuvel-Panhuizen, Marja
This paper problematizes the issue of how decisions about the content of mathematics education can be made. After starting with two examples where research in mathematics education resulted in different choices on the content of primary school teaching, I explore where and how, in the scientific enterprise within the domain of education, issues of…
In this theoretical article, I argue for a relational stance on learning as a way of reckoning with educational research as part of the settler colonial structure of the United States. Because of my geopolitical location to the United States as a settler colony, I begin by contrasting the stances of anticolonial and decolonial. I then analyze the…
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated ...
Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...
van Dijk Frank JH
Background: Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A) network tools, which link questioners directly to experts, can overcome some of these barriers. When designing and testing online tools, assessing the usability and applicability is essential. Therefore, the purpose of this study is to assess the usability and applicability of a new online Q&A network tool for answers on OSH questions. Methods: We applied a cross-sectional usability test design. Eight occupational health experts and twelve potential questioners from the working population (workers) were purposively selected to include a variety of computer and internet experience. During the test, participants were first observed while executing eight tasks that entailed important features of the tool. In addition, they were interviewed. Through task observations and interviews we assessed applicability, usability (effectiveness, efficiency and satisfaction) and facilitators and barriers in use. Results: Most features were usable, though several could be improved. Most tasks were executed effectively. Some tasks, for example searching stored questions in categories, were not executed efficiently, and participants were less satisfied with the corresponding features. Participants' recommendations led to improvements. The tool was found mostly applicable for additional information, to observe new OSH trends and to improve contact between OSH experts and workers. Hosting and support by a trustworthy professional organization, effective implementation campaigns, timely answering and anonymity were seen as important use requirements. Conclusions: This network tool is a promising new strategy for offering company workers high quality information
Shneerson, Catherine L; Gale, Nicola K
The need for mixed methods research in answering health care questions is becoming increasingly recognized because of the complexity of factors that affect health outcomes. In this article, we argue for the value of using a qualitatively driven mixed method approach for identifying and answering clinically relevant research questions. This argument is illustrated by findings from a study on the self-management practices of cancer survivors and the exploration of one particular clinically relevant finding about higher uptake of self-management in cancer survivors who had received chemotherapy treatment compared with those who had not. A cross-sectional study generated findings that formed the basis for the qualitative study, by informing the purposive sampling strategy and generating new qualitative research questions. Using a quantitative research component to supplement a qualitative study can enhance the generalizability and clinical relevance of the findings and produce detailed, contextualized, and rich answers to research questions that would be unachievable through quantitative or qualitative methods alone. © The Author(s) 2015.
Gabrielsen, Jonas; Jønch-Clausen, Heidi; Pontoppidan, Christina
In the context of political press conferences, the authors explore a particular category of subtle evasions they term shifting. When shifting, the interviewee seemingly agrees to answer the journalist’s question. However, in providing the answer, the interviewee refocuses the question, replacing...
Williams, Karen Patricia; Templin, Thomas N; Hines, Resche D
There is a need for health care providers and health care educators to ensure that the messages they communicate are understood. The purpose of this research was to test the reliability and validity, in a culturally diverse sample of women, of a revised Breast Cancer Literacy Assessment Tool (Breast-CLAT) designed to measure functional understanding of breast cancer in English, Spanish, and Arabic. Community health workers verbally administered the 35-item Breast-CLAT to 543 Black, Latina, and Arab American women. A confirmatory factor analysis using a 2-parameter item response theory model was used to test the proposed 3-factor Breast-CLAT (awareness, screening and knowledge, and prevention and control). The confirmatory factor analysis had a good fit (TLI = .91, RMSEA = .04) to the proposed 3-factor structure. The total scale reliability ranged from .80 for Black participants to .73 for the total culturally diverse sample. The three subscales were differentially predictive of family history of cancer. The revised Breast-CLAT scales demonstrated internal consistency reliability and validity in this multiethnic, community-based sample.
United States. Bonneville Power Administration.
Most people know that electric power lines, like the wiring in our homes, can cause serious electric shocks if we're not careful. Many people also want to know whether the EMF (electric and magnetic fields) produced by power lines and other electrical devices affect our health. Although no adverse health effects of electric power EMF have been confirmed, there is continued scientific uncertainty about this issue. Research on EMF is ongoing throughout the world. The purpose of this booklet is to answer some common questions that the BPA (Bonneville Power Administration) receives about the possible effects of power lines on health. First, some basic electrical terms are defined, and electric and magnetic fields are described. Next, answers are given to several questions about recent scientific studies. Some important information about electrical safety follows. We then describe how BPA is addressing public concerns about potential health effects of power lines. The last section tells you how to obtain more detailed information about the health and safety issues summarized in this booklet.
Osborne-Gowey, J.; Strittholt, J.; Bergquist, J.; Ward, B. C.; Sheehan, T.; Comendant, T.; Bachelet, D. M.
The world’s aquatic resources are experiencing anthropogenic pressures on an unprecedented scale and aquatic organisms are experiencing widespread population changes and ecosystem-scale habitat alterations. Climate change is likely to exacerbate these threats, in some cases reducing the range of native North American fishes by 20-100% (depending on the location of the population and the model assumptions). Scientists around the globe are generating large volumes of data that vary in quality, format, supporting documentation, and accessibility. Moreover, diverse models are being run at various temporal and spatial scales as scientists attempt to understand previous (and project future) human impacts to aquatic species and their habitats. Conservation scientists often struggle to synthesize this wealth of information for developing practical on-the-ground management strategies. As a result, the best available science is often not utilized in the decision-making and adaptive management processes. As aquatic conservation problems around the globe become more serious and the demand to solve them grows more urgent, scientists and land-use managers need a new way to bring strategic, science-based, and action-oriented approaches to aquatic conservation. The Conservation Biology Institute (CBI), with partners such as ESRI, is developing an Aquatic Center as part of a dynamic, web-based resource (Data Basin; http://databasin.org) that centralizes usable aquatic datasets and provides analytical tools to visualize, analyze, and communicate findings for practical applications. To illustrate its utility, we present example datasets of varying spatial scales and synthesize multiple studies to arrive at novel solutions to aquatic threats.
Green plants are the ultimate source of all resources required for man's life, his food, his clothes, and almost all his energy requirements. Primitive prehistoric man could live from the abundance of nature surrounding him. Man today, dominating nature in terms of numbers and exploiting its limited resources, cannot exist without employing his intelligence to direct natural evolution. Plant sciences, therefore, are not a matter of curiosity but an essential requirement. From such considerations, the IAEA and FAO jointly organized a symposium to assess the value of mutation research for various kinds of plant science, which directly or indirectly might contribute to sustaining and improving crop production. The benefit through developing better cultivars that plant breeders can derive from using the additional genetic resources resulting from mutation induction has been assessed before at other FAO/IAEA meetings (Rome 1964, Pullman 1969, Ban 1974, Ibadan 1978) and is also monitored in the Mutation Breeding Newsletter, published by IAEA twice a year. Several hundred plant cultivars which carry economically important characters because their genes have been altered by ionizing radiation or other mutagens, are grown by farmers and horticulturists in many parts of the world. But the benefit derived from such mutant varieties is without any doubt surpassed by the contribution which mutation research has made towards the advancement of genetics. For this reason, a major part of the papers and discussions at the symposium dealt with the role induced-mutation research played in providing insight into gene action and gene interaction, the organization of genes in plant chromosomes in view of homology and homoeology, the evolutionary role of gene duplication and polyploidy, the relevance of gene blocks, the possibilities for chromosome engineering, the functioning of cytoplasmic inheritance and the genetic dynamics of populations. In discussing the evolutionary role of
Wells, Amy Stuart; Roda, Allison
This chapter examines how the larger political context and policies enacted at different points in American history have affected the questions education researchers asked and answered. The authors argue that while education researchers are often quick to consider how their research should shape policy, they are less likely to contemplate the…
Research tools are the resources researchers need to use in experimental work. In biotechnology, these can include cell lines, monoclonal antibodies, reagents, animal models, growth factors, combinatorial chemistry libraries, drugs and drug targets, clones and cloning tools (such as PCR), methods, laboratory equipment and machines, databases and computer software. Research tools therefore serve as the basis for upstream research to improve a present product or process. There are several challenges in the way of using patented research tools. IP issues with regard to research tools are important and may sometimes pose a hindrance for researchers; in the case of patented research tools, IPR issues can constitute a major hurdle for technology development. In the majority of instances, research tools are made available through MTAs for academic research and for imparting education. TRIPS provides for an exception to patent rights for experimental use of patented technology in scientific research, and several countries, including India, have included this provision in their patent legislation. For commercially important work, licensing of research tools can be based on royalties or a one-time lump-sum payment. Some patent owners of important high-end research tools for the development of platform technologies create problems in licensing, which can impede research. Usually the cost of a commercially available research tool is built into its price.
Wininger, Michael; Pidcoe, Peter
The Academy of Pediatric Physical Therapy Research Summit IV issued a Call to Action for community-wide intensification of a research enterprise in inquiries related to pediatric brain injury and motor disability by way of technological integration. But the barriers can seem high, and the pathways to integrative clinical research can seem poorly marked. Here, we answer the Call by providing a framework for 3 objectives: (1) instrumentation, (2) biometrics and study design, and (3) data analytics. We identify emergent cases where this Call has been answered and advocate for others to echo the Call both in highly visible physical therapy venues and in forums where the audience is diverse.
Mahajan Ashwini; Prof. B.V. Jain; Dr Surajj Sarode
A centrifuge is a critical piece of equipment for the laboratory. The purpose of this study was to examine the research centrifuge in detail: its applications, its uses in different branches, and its salient features. There are two types of research centrifuge studied here: the revolutionary research centrifuge and the microprocessor research centrifuge. A centrifuge is a device that separates particles from a solution through use of a rotor. In biology, the particles are usually cells, subcellular organelles, or large molecules...
Rhebergen, Martijn D. F.; Hulshof, Carel T. J.; Lenderink, Annet F.; van Dijk, Frank J. H.
Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A)
McColgin, Dave W.; Gregory, Michelle L.; Hetzler, Elizabeth G.; Turner, Alan E.
Research in Question Answering has focused on the quality of information retrieval or extraction using the metrics of precision and recall to judge success; these metrics drive toward finding the specific best answer(s) and are best supportive of a lookup type of search. These do not address the opportunity that users' natural language questions present for exploratory interactions. In this paper, we present an integrated Question Answering environment that combines a visual analytics tool for unstructured text and a state-of-the-art query expansion tool designed to complement the cognitive processes associated with an information analyst's workflow. Analysts are seldom looking for factoid answers to simple questions; their information needs are much more complex in that they may be interested in patterns of answers over time or conflicting information, and even related non-answer data may be critical to learning about a problem or reaching prudent conclusions. In our visual analytics tool, questions result in a comprehensive answer space that allows users to explore the variety within the answers and spot related information in the rest of the data. The exploratory nature of the dialog between the user and this system requires tailored evaluation methods that better address the evolving user goals and counter cognitive biases inherent to exploratory search tasks.
United States. Bonneville Power Administration.
Most people know that electric power lines, like the wiring in our homes, can cause serious electric shocks if we're not careful. Many people also want to know whether the electric and magnetic fields (EMF) produced by power lines and other electrical devices cause health effects. The purpose of this booklet is to answer some common questions that the Bonneville Power Administration (BPA) receives about the possible effects of power lines on health. First, some basic electrical terms are defined, and electric and magnetic fields are described. Next, answers are given to several questions about recent scientific studies. Some important information about electrical safety follows. We then describe how BPA is addressing public concerns about potential health effects of power lines. The last section tells you how to obtain more detailed information about the health and safety issues summarized in this booklet.
United States. Bonneville Power Administration.
Most people know that electric power lines, like the wiring in our homes, can cause serious electric shocks if we're not careful. Many people also want to know whether the electric and magnetic fields (EMF) produced by power lines and other electrical devices cause health effects. The purpose of this pamphlet is to answer some common questions that the Bonneville Power Administration (BPA) receives about the possible effects of power lines on health. (BPA is the Pacific Northwest's Federal electric power marketing agency.) First, some basic electrical terms are defined, and electric and magnetic fields are described. Next, answers are given to several questions about recent scientific studies. We then describe how BPA is addressing public concerns raised by these studies. Some important information about electrical safety follows. The last section tells you how to obtain more detailed information about the health and safety issues summarized in this pamphlet.
Rhebergen, M.D.F.; Hulshof, C.T.J.; Lenderink, A.F.; van Dijk, F.J.H.
Background: Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online
Cunningham, John A; Neighbors, Clayton; Bertholet, Nicolas; Hendershot, Christian S
There is a growing use of mobile devices to access the Internet. We examined whether participants who used a mobile device to access a brief online survey were quicker to respond to the survey but also less likely to complete it than participants using a traditional web browser. Using data from a recently completed online intervention trial, we found that participants using mobile devices were quicker to access the survey but less likely to complete it compared to participants using a traditional web browser. More concerning, mobile device users were also less likely to respond to a request to complete a six week follow-up survey compared to those using traditional web browsers. With roughly a third of participants using mobile devices to answer an online survey in this study, the impact of mobile device usage on survey completion rates is a concern. ClinicalTrials.gov: NCT01521078.
Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C
This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
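The per-pixel correction the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the authors' C# implementation: the look-up table values, array shapes, and the function name `correct_pdip` are invented, and real correction factors would come from measured EPID response data.

```python
import numpy as np

# Hypothetical look-up table mapping MLC shielding fraction -> correction
# factor. The values below are illustrative only; in practice they would
# be derived from measurements of EPID response to MLC-transmitted beams.
lut_fractions = np.linspace(0.0, 1.0, 11)
lut_factors = np.linspace(1.0, 0.85, 11)

def correct_pdip(pdip, shielded_fraction):
    """Apply an MLC-transmission correction to a predicted portal dose image.

    pdip              : 2-D array, predicted EPID image from the TPS
    shielded_fraction : 2-D array, fraction of beam-on time each pixel
                        is shielded by MLC leaves
    """
    # Interpolate a correction factor for each pixel from the look-up table
    factors = np.interp(shielded_fraction, lut_fractions, lut_factors)
    # Corrected image = product of the original PDIP and the factors
    return pdip * factors

# Toy example: a flat 100-unit predicted image whose right half is
# fully shielded by MLC leaves for the whole beam-on time.
pdip = np.full((4, 4), 100.0)
frac = np.zeros((4, 4))
frac[:, 2:] = 1.0
corrected = correct_pdip(pdip, frac)
print(corrected[0, 0], corrected[0, 3])  # open pixel 100.0, shielded pixel 85.0
```

The interpolation step mirrors item (iii) of the pipeline and the element-wise product mirrors item (iv); reading the MLC file and writing the corrected image to disk are omitted.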
To improve the learning of basic concepts in molecular biology in an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students’ perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4%) accessed all videos, and the majority (98.4%) found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended, limiting the impact on students’ involvement in the learning of the material presented in class.
such previous work, two case studies are presented, in which drawings helped investigate the relationship between media technology users and two specific devices, namely television and mobile phones. The experiment generated useful data and opened for further consideration of the method as an appropriate HCI research tool.
Pantula, Sastry; Dickey, David
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
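As a minimal sketch of the least squares machinery such a text covers (the data and variable names here are illustrative, not taken from the book), a straight line can be fitted as follows:

```python
import numpy as np

# Noiseless data generated from y = 1 + 2x, so least squares should
# recover the intercept and slope essentially exactly.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x

# Design matrix: a column of ones for the intercept plus the predictor x.
X = np.column_stack([np.ones_like(x), x])

# Solve min ||y - X b||^2. lstsq is numerically safer than explicitly
# forming and inverting the normal-equations matrix (X'X)^{-1} X'y.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
print(intercept, slope)  # approximately 1.0 and 2.0
```

With real, noisy data the estimates would not reproduce the generating coefficients exactly; the residuals returned by `lstsq` then quantify the lack of fit.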
Stender, V.; Schroeder, M.; Wächter, J.
Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. Especially the management of research data is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective to establish concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased efforts of maintenance are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a
Open-ended probing questions in cross-cultural surveys help uncover equivalence problems in cross-cultural survey research. For languages that a project team does not understand, probe answers need to be translated into a common project language. This article presents a case study on translating open-ended, that is, narrative answers. It describes…
Sloan Work and Family Research Network, 2009
The Sloan Work and Family Research Network has prepared Fact Sheets that provide statistical answers to some important questions about work-family and work-life issues. This Fact Sheet includes statistics about Children in Self-Care, and answers the following questions about school-age children in self-care: (1) How many school-age children are in…
Objective ‐ To examine the similarities and differences between research questions asked by librarians in 2001 and those posed in 2006, and to explore to what extent the published research supports the questions being asked. Methods ‐ Questions collected in 2001 by members of the Evidence‐Based Librarianship Implementation Committee (EBLIC) of the MLA Research Section were compared with questions collected in 2006 at a cross‐sectoral seminar introducing evidence based library and information practice to Australian librarians. Questions from each list were categorized using the domains of librarianship proposed by Crumley and Koufogiannakis in 2001, and examined with reference to a content analysis of the library and information studies (LIS) research published in 2001 by Koufogiannakis, Slater, and Crumley in 2004. Results ‐ In 2001 and 2006 the most commonly asked questions were in the domain of management (29%, 33%), followed by education (24%, 18.5%). In 2001 questions in the marketing/promotion category ranked lowest (1%); however, representation was much greater in 2006 (18.5%), ranking equal second with education. Questions in the lowest ranked domain in 2006 (collections, 6%) had been more common in 2001, where collections ranked third, representing 19% of the questions. Koufogiannakis, Slater, and Crumley's content analysis of LIS research published in 2001 revealed that the most popular domain for research was information access and retrieval (38%), followed by collections (24%). Only 1% of published LIS research (seven articles) was in the domain of marketing/promotion. In contrast, 36 articles originally assigned to one of the six established domains could more appropriately have been included in a proposed new domain of professional issues. Conclusion ‐ The disparity between the questions being asked by practitioners and the evidence being generated by researchers suggests that the research‐practice gap is still an issue. A content…
Foubert, John D.
Rape prevention programmers and researchers have long struggled to select the most appropriate theoretical models to frame their work. Questions abound regarding appropriate standards of evidence for success of program interventions. The present article provides an alternative point of view to the one put forward by seven staff members from the…
Doubal, Fergus N; Ali, Myzoon; Batty, G David; Charidimou, Andreas; Eriksdotter, Maria; Hofmann-Apitius, Martin; Kim, Yun-Hee; Levine, Deborah A; Mead, Gillian; Mucke, Hermann A M; Ritchie, Craig W; Roberts, Charlotte J; Russ, Tom C; Stewart, Robert; Whiteley, William; Quinn, Terence J
Traditional approaches to clinical research have, as yet, failed to provide effective treatments for vascular dementia (VaD). Novel approaches to the collation and synthesis of data may allow for time- and cost-efficient hypothesis generating and testing. These approaches may have particular utility in helping us understand and treat a complex condition such as VaD. We present an overview of new uses for existing data to progress VaD research. The overview is the result of consultation with various stakeholders, focused literature review and learning from the group's experience of successful approaches to data repurposing. In particular, we benefitted from the expert discussion and input of delegates at the 9th International Congress on Vascular Dementia (Ljubljana, 16-18 October 2015). We agreed on key areas that could be of relevance to VaD research: systematic review of existing studies; individual patient-level analyses of existing trials and cohorts; and linking electronic health record data to other datasets. We illustrated each theme with a case study of an existing project that has utilised this approach. There are many opportunities for the VaD research community to make better use of existing data. The volume of potentially available data is increasing and the opportunities for using these resources to progress the VaD research agenda are exciting. Of course, these approaches come with inherent limitations and biases, as bigger datasets are not necessarily better datasets, and maintaining rigour and critical analysis will be key to optimising data use.
Over the past decade, northern Canada has experienced a substantial increase in government reliance on advisory co-management organizations to manage caribou populations. Such groups, which are usually composed of government and local representatives, constantly require information about caribou upon which to base their recommendations. However, the standard 'scientific' approach to obtaining and presenting such information is in many cases no longer appropriate. In order to readjust the scientific focus on caribou research so that it is better attuned to co-management, this paper examines the role that research plays in the Canadian management of the Porcupine Caribou Herd as practiced by the Porcupine Caribou Management Board - a co-management advisory organization with a majority of native representatives.
Participant reports of their own behaviour are critical for the provision and evaluation of behavioural interventions. Recent developments in brief alcohol intervention trials provide an opportunity to evaluate longstanding concerns that answering questions on behaviour as part of research assessments may inadvertently influence it and produce bias. The study objective was to evaluate the size and nature of effects observed in randomized manipulations of the effects of answering questions on drinking behaviour in brief intervention trials. Multiple methods were used to identify primary studies. Between-group differences in total weekly alcohol consumption, quantity per drinking day and AUDIT scores were evaluated in random-effects meta-analyses. Ten trials were included in this review, of which two did not provide findings for quantitative study, in which three outcomes were evaluated. Between-group differences were of the magnitude of 13.7 (-0.17 to 27.6) grams of alcohol per week (approximately 1.5 U.K. units or 1 standard U.S. drink) and 1 point (0.1 to 1.9) in AUDIT score. There was no difference in quantity per drinking day. Answering questions on drinking in brief intervention trials appears to alter subsequent self-reported behaviour. This potentially generates bias by exposing non-intervention control groups to an integral component of the intervention. The effects of brief alcohol interventions may thus have been consistently under-estimated. These findings are relevant to evaluations of any interventions to alter behaviours which involve participant self-report.
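The random-effects pooling used in reviews like this one can be sketched with the DerSimonian-Laird method. The per-trial mean differences and standard errors below are invented for illustration; they are not the review's actual trial data.

```python
import math

# Hypothetical per-trial data: (mean difference in weekly alcohol, grams;
# standard error). Illustrative numbers only, not the review's trials.
trials = [(10.0, 6.0), (20.0, 8.0), (5.0, 7.0), (18.0, 9.0)]

def dersimonian_laird(effects):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    w = [1.0 / se ** 2 for _, se in effects]        # fixed-effect weights
    y = [e for e, _ in effects]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-trial variance
    w_star = [1.0 / (se ** 2 + tau2) for _, se in effects]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

est, (lo, hi) = dersimonian_laird(trials)
print(f"pooled difference: {est:.1f} g/week, 95% CI ({lo:.1f}, {hi:.1f})")
```

When the between-trial heterogeneity estimate is zero, the pooled value collapses to the fixed-effect inverse-variance average, which is the expected behaviour of this estimator.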
Julius, Matthew L.; Schoenfuss, Heiko L.
This laboratory exercise introduces students to a fundamental tool in evolutionary biology--phylogenetic inference. Students are required to create a data set via observation and through mining preexisting data sets. These student data sets are then used to develop and compare competing hypotheses of vertebrate phylogeny. The exercise uses readily…
Sholtys, Phyllis A.
The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…
Vellinga, P.; Van Dorland, R.; Kabat, P.
In some of the previous issues of this magazine (Spil 2007, issues 4 and 5-6, and Spil 2008, issue 1) the authors Labohm, Roersch and Thoenes launched a frontal attack on the greenhouse theory and on the researchers who report on the state of the science in the framework of the IPCC. The author of this article addresses two main questions arising from the above-mentioned articles: (1) Does the use of fossil fuels affect the global climate?; and (2) Is the warming of the last 30 years related to the increasing concentrations of greenhouse gases in the atmosphere?
Among the publications of CDC Climat Research, the 'Tendances Carbone' bulletin specifically studies the developments of the European market for CO2 allowances. This issue addresses the following points: To establish a climate and energy policy in the EU in 2030, CDC Climat Research addresses three main recommendations to the European Commission: (1) Establish a binding, single and ambitious CO2 emission reduction target of at least 40% in 2030. (2) Put the EU ETS as the central and non-residual instrument aimed at promoting cost-effective reductions in Europe and other parts of the world. (3) Define a stable, predictable and flexible climate regulation to limit carbon leakage and encourage innovation. Key drivers of the European carbon price this month: - The European Parliament has adopted back-loading: 1.85 billion EUAs will be sold at auction between now and 2015 instead of 2.75 billion; - Phase 2 compliance: a surplus of 1,742 million tonnes (excluding the aviation sector), including auctions; - Energy Efficiency Directive: 22 of the 27 Member States have forwarded indicative targets for 2020 to the European Commission; these targets will be assessed in early 2014.
Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim
Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim at the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects provide their own solutions to manage research data. Within two projects - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). In order to fully exploit the potential of these developments for enhancing data management in the geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the geosciences developed in two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can then be set up much faster, customized to individual needs, and tools can be added at run-time.
Didier, Damien; Gariel, Jean-Christophe; Bruno, Valerie; Debayle, Christophe
Very highly efficient filters containing a porous glass fibre fabric are used in industrial installations to trap radioactive or toxic particles in order to limit their release, notably in accidental situations. This set of articles thus discusses various issues related to the use of such filters. A first article describes how air radioactivity is continuously monitored by two coexisting networks, Opera-Air and Teleray; it indicates where air radioactivity comes from and how the origin of a release can be determined, and outlines the importance of modelling tools. Air monitoring around the Gravelines nuclear power plant is briefly presented with a drawing. A second article comments on the numerous tools which are used as information channels about the monitoring of air radioactivity: web sites, a mobile application, and so on. The last article briefly describes the journey of a filter from its removal on a Monday to a complete and validated analysis, which takes between two and four weeks.
Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.
Regarding the case named in the title, which was referred by the prime minister on July 19, 1994, and partly amended on November 21, 1994, the Nuclear Safety Commission answered the prime minister as follows after prudent deliberation. As for the application of the criteria for permission, the technical capability is adequate, and the results of the examination of safety by the expert committee for examining nuclear fuel safety are adequate. It was judged that safety after the permission of this waste-burying business can be secured. The expert committee reported on the policy and contents of the investigation and deliberation, covering: the basic location conditions, namely site, weather, ground, hydraulics, earthquakes and social environment; the radioactive wastes to be buried; the method of determining radioactivity concentration; the expected time of changing the measures to be taken for security; the safety design for the waste-burying facility related to radiation control, environmental safety, earthquakes, fires and explosion, the loss of electric power, and the standards and criteria to be conformed to; the assessment of dose equivalent in the normal state and after the end of the control period; safety evaluation; and the course of the investigation and deliberation. (K.I.)
Kevin R. Butt
Earthworms are responsible for soil development, recycle organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low-cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development, and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
Butt, K.R.; Grigoropoulou, N.
Earthworms are responsible for soil development, recycle organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low-cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development, and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
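Addressing population dynamics with inexpensive marking techniques, as the abstract above describes, is classically done with mark-recapture estimation. A minimal sketch using the Lincoln-Petersen estimator follows; the counts are invented for illustration, not taken from the paper.

```python
# Hypothetical mark-recapture figures for a single earthworm plot.
marked_first = 120   # worms collected, marked and released in the first sample
second_sample = 90   # worms collected in the second sample
recaptured = 27      # marked worms found among the second sample

# Lincoln-Petersen estimate: N ~= (M * C) / R, assuming a closed population,
# equal catchability, and no mark loss between samples.
estimate = marked_first * second_sample / recaptured
print(f"estimated population: {estimate:.0f}")  # prints "estimated population: 400"
```

The Chapman correction, (M+1)(C+1)/(R+1) - 1, is commonly preferred when recapture counts are small, since the raw Lincoln-Petersen estimate is biased upward.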
Michael A. Langston
Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual's genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complex systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.
Rather than produce clear-cut answers to well-defined problems, research on future environmental policy issues requires a different approach whereby researchers are partners in joint learning processes among stakeholders, policy makers, NGOs (Non-Governmental Organisations) and industry. This
Katherine D. Seelman
The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review the literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into the main topics of telerehabilitation, policy, population of users, and policy-specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification of Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are paramount to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality-of-life items such as assistive technology and environmental factors. Keywords: Telerehabilitation, Telehomecare, Telepsychiatry, Telepractice
Andreia Salvan Pagnan
Within the universe of women's clothing, underwear long remained insignificant with regard to the development of new textile materials, shapes and colors. Panties, once known as breeches or long underwear, only became a necessity around the twentieth century with Christian Dior's vaporous dresses of the 1950s. Technological advances in the textile industry brought spandex, created by the American laboratory DuPont and better known as Lycra. The elasticity of the fabric brought comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand users' desires, qualitative research was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared with a new one to be launched. Through the Quality Function Deployment (QFD) tool, the answers obtained from the users were interpreted so as to prioritize targets for the development of a product based on analyses of the desired characteristics, which are converted into technical attributes.
Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team
The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.
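The catalog matching mentioned above is, at its core, a search for co-spatially located sources within some angular tolerance. A minimal sketch of a naive cross-match follows; the coordinates are invented toy values, and real VAO/CANDELS services use spatially indexed algorithms rather than this brute-force loop.

```python
import math

def angsep(ra1, dec1, ra2, dec2):
    """Angular separation in arcseconds between two sky positions given in
    degrees, using the haversine formula on the celestial sphere."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h))) * 3600

def crossmatch(cat_a, cat_b, radius_arcsec=1.0):
    """Nearest-neighbour match of catalog A sources to catalog B within a
    matching radius. O(n*m); fine for a demo, too slow for survey catalogs."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        best = min(range(len(cat_b)),
                   key=lambda j: angsep(ra_a, dec_a, *cat_b[j]))
        if angsep(ra_a, dec_a, *cat_b[best]) <= radius_arcsec:
            matches.append((i, best))
    return matches

# Toy catalogs: (RA, Dec) in degrees -- invented positions for illustration.
a = [(150.00000, 2.20000), (150.10000, 2.30000)]
b = [(150.00002, 2.20001), (151.00000, 3.00000)]
print(crossmatch(a, b))  # the first source of A pairs with the first of B
```

In practice the matching radius is chosen from the astrometric accuracy and depth of the catalogs involved, which is exactly where matching across images of variable depths and resolutions becomes nontrivial.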
This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment. (author)
Federated searching was once touted as the library world's answer to Google, but ten years since federated searching technology's inception, how does it actually compare? This study focuses on undergraduate student preferences and perceptions when doing research using both Google and a federated search tool. Students were asked about their…
Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria
Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...
Andrea Calsamiglia Madurga
We present a theoretical and epistemological reflection on Forum Theater's potential as a research tool. Our involvement in social action and research has led us to a double reflection: on the limitations of qualitative research in the study of affects, and on Forum Theater's potential as a research tool for tackling research about affects. Following some specific experiences in action research (qualitative research on romantic love and gender violence, and the creation process of the Forum Theater “Is it a joke?”), we explore Forum Theater's possibilities as a research tool within the feminist epistemology framework.
Nijssen, E.J.; Frambach, R.T.
This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client
Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to assess...
The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...
O V Verkhodanov
We describe the current status of CATS (astrophysical CATalogs Support system), a publicly accessible tool maintained at the Special Astrophysical Observatory of the Russian Academy of Sciences (SAO RAS) (http://cats.sao.ru) allowing one to search hundreds of catalogs of astronomical objects discovered all along the electromagnetic spectrum. Our emphasis is mainly on catalogs of radio continuum sources observed from 10 MHz to 245 GHz, and secondly on catalogs of objects such as radio and active stars, X-ray binaries, planetary nebulae, HII regions, supernova remnants, pulsars, nearby and radio galaxies, AGN and quasars. CATS also includes the catalogs from the largest extragalactic surveys at non-radio wavelengths. In 2008 CATS comprised a total of about 10^9 records from over 400 catalogs in the radio, IR, optical and X-ray windows, including most source catalogs deriving from observations with the Russian radio telescope RATAN-600. CATS offers several search tools through different ways of access, e.g. via a web interface and e-mail. Since its creation in 1997 CATS has served about 10^5 requests. Currently CATS is used by external users about 1500 times per day and since its opening to the public in 1997 has received about 4000 requests for its selection and matching tasks.
Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter
In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 Speech-language Pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.
Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.
This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed...
Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.
Lobashev, V.M.; Tavkhelidze, A.N.
A meson facility is being built at the Institute of Nuclear Research, USSR Academy of Sciences, in Troitsk, where the Scientific Center of the USSR Academy of Sciences is located. The facility will include a linear accelerator for protons and negative hydrogen ions with 600 MeV energy and 0.5-1 mA beam current. Some fundamental problems that can be studied at a meson facility are described in the areas of elementary particles, neutron physics, solid state physics, and applied research. The characteristics of the linear accelerator are given and the meson facility's experimental complex is described.
Ebrahim, Nader Ale
“Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...
View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy
This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…
Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.
The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging that is unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component …
Cragg, Liza; Williams, Siân; van der Molen, Thys; Thomas, Mike; Correia de Sousa, Jaime; Chavannes, Niels H
There is growing awareness amongst healthcare planners, providers and researchers of the need to make better use of routinely collected health data by translating it into actionable information that improves efficiency of healthcare and patient outcomes. There is also increased acceptance of the importance of real world research that recruits patients representative of primary care populations and evaluates interventions realistically delivered by primary care professionals. The UNLOCK Group is an international collaboration of primary care researchers and practitioners from 15 countries. It has coordinated and shared datasets of diagnostic and prognostic variables for COPD and asthma to answer research questions meaningful to professionals working in primary care over a 6-year period. Over this time the UNLOCK Group has undertaken several studies using data from unselected primary care populations from diverse contexts to evaluate the burden of disease, multiple morbidities, treatment and follow-up. However, practical and structural constraints have hampered the UNLOCK Group's ability to translate research ideas into studies. This study explored the constraints, challenges and successes experienced by the UNLOCK Group and its participants' learning as researchers and primary care practitioners collaborating to answer primary care research questions. The study identified lessons for future studies and collaborations that require data sharing across borders. It also explored specific challenges to fostering the exchange of primary care data in comparison to other datasets such as public health, prescribing or hospital data and mechanisms that may be used to overcome these.
Gold, A. U.; Harris, S. E.
The greenhouse effect comes up in most discussions about climate and is a key concept related to climate change. Existing studies have shown that students and adults alike lack a detailed understanding of this important concept or may hold misconceptions. We studied the effectiveness of different interventions on university-level students' understanding of the greenhouse effect. Introductory-level science students were tested for their prior knowledge of the greenhouse effect using validated multiple-choice questions, short answers and concept sketches. All students participated in a common lesson about the greenhouse effect and were then randomly assigned to one of two lab groups. One group explored an existing simulation about the greenhouse effect (PhET lesson) and the other group worked with absorption spectra of different greenhouse gases (Data lesson) to deepen their understanding of the greenhouse effect. All students completed the same assessment, including multiple choice, short answers and concept sketches, after participation in their lab lesson. 164 students completed all the assessments: 76 completed the PhET lesson and 77 completed the data lesson; 11 students missed the contrasting lesson. In this presentation we show the comparison between the multiple-choice questions, short answer questions and the concept sketches of students. We explore how well each of these assessment types represents students' knowledge. We also identify items that are indicators of the level of understanding of the greenhouse effect, as measured by the correspondence of student answers to an expert mental model and expert responses. Preliminary data analysis shows that students who produce concept sketch drawings that come close to expert drawings also choose correct multiple-choice answers. However, correct multiple-choice answers do not necessarily indicate that a student produces expert-like concept sketches. Multiple-choice questions that require detailed …
This article explores the role of drawing as a tool for reflection. It reports on a PhD research project that aims to identify and analyse the value that co-design processes can bring to participants and their communities. The research is associated with Leapfrog, a three-year project funded by the UK Arts and Humanities Research Council (AHRC).…
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it feasible to use smartphones as a data collection tool for social research? What are the effects on data quality? Can we also incorporate reality mining tools in the smartphone app to replace traditional...
Price, Geoffrey P.; Wright, Vivian H.
Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…
Michael D. Coovert
Serious games are an attractive tool for education and training, but their utility is even broader. We argue that serious games provide a unique opportunity for research as well, particularly in areas where multiple players (groups or teams) are involved. In our paper we provide background in several substantive areas. First, we outline major constructs and challenges found in team research. Second, we discuss serious games, providing an overview and a description of their role in education, training, and research. Third, we describe the characteristics needed in game engines used for team research, followed by a discussion of the value added by utilizing serious games. Our goal in this paper is to argue that serious games are an effective tool with demonstrated reliability and validity and should be part of a research program for those engaged in team research. Both team researchers and those involved in serious game development can benefit from a mutual partnership that is research focused.
Schou, Lone; Høstrup, Helle; Lyngsø, Elin
Schou L., Høstrup H., Lyngsø E.E., Larsen S. & Poulsen I. (2011) Validation of a new assessment tool for qualitative research articles. Journal of Advanced Nursing. doi: 10.1111/j.1365-2648.2011.05898.x. Aim: This paper presents the development and validation of a new assessment tool for qualitative research articles, which could assess trustworthiness of qualitative research articles as defined by Guba and at the same time aid clinicians in their assessment. Background: There are more than 100 sets of proposals for quality criteria for qualitative research. However, we … is the Danish acronym for Appraisal of Qualitative Studies. Phase 1 was to develop the tool based on a literature review and on consultation with qualitative researchers. Phase 2 was an inter-rater reliability test in which 40 health professionals participated. Phase 3 was an inter-rater reliability test among …
Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao
High-speed machining technology can improve processing efficiency and precision while also reducing processing cost, and the technology is therefore widely valued in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, in high-speed precision machining, several new kinds of chucks are in use, including the heat-shrinkage tool-holder, the high-precision spring chuck, the hydraulic tool-holder, and the three-rib deformation chuck. Among them, the heat-shrinkage tool-holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is widely used. It is therefore of great significance to study the new requirements of the machining tool system. In order to meet the requirements of high-speed precision machining technology, this paper describes the common tool-holder technologies for high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics of, and existing problems in, tool clamping systems are analyzed.
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy through 'hands-on' activity. Available online, the tool provides spectra visualization and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables users to visualize spectral signatures from the USGS spectral library as well as spectra collected in the EPIF, such as those of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool also allows loading locally collected samples for further analysis.
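The EPIF Spec tool's implementation is not published in this abstract, but the spectral angle mapping it offers is a standard similarity measure: the angle between two spectra treated as vectors, insensitive to overall brightness. A minimal sketch (function name and example spectra are illustrative, not taken from the tool):

```python
import numpy as np

def spectral_angle(target: np.ndarray, reference: np.ndarray) -> float:
    """Spectral angle (radians) between two spectra treated as vectors.

    Smaller angles mean more similar spectral shapes; scaling a spectrum
    by a constant (a brightness change) leaves the angle unchanged.
    """
    cos_theta = np.dot(target, reference) / (
        np.linalg.norm(target) * np.linalg.norm(reference)
    )
    # Clip guards against tiny floating-point excursions outside [-1, 1].
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Identical shapes at different brightness give an angle of ~0.
a = np.array([0.2, 0.4, 0.6])
assert abs(spectral_angle(a, 2 * a)) < 1e-6
```

In a mapping context, each pixel spectrum would be compared against library signatures (e.g. from the USGS library the tool draws on) and assigned to the material with the smallest angle.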
Gerhardus, A; Schilling, I; Voss, M
Public health education aims at enabling students to deal with complex health-related challenges using appropriate methods based on sound theoretical understanding. Virtually all health-related problems in science and practice require the involvement of different disciplines. However, the necessary interdisciplinarity is only partly reflected in the curricula of public health courses. Theories, methods, health topics, and their application are also often taught side by side rather than together. For students, it can become an insurmountable challenge to integrate the different disciplines ("horizontal integration") and theories, methods, health topics, and their application ("vertical integration"). This situation is described here for public health education but is representative of other interdisciplinary fields as well. Several approaches are available to achieve the horizontal integration of different disciplines and the vertical integration of theories, methods, health topics, and their application. A curriculum that is structured by topics rather than disciplines might be more successful in integrating different disciplines. Vertical integration can be achieved by research-based learning, which places a student-led research project at the centre of teaching. Students choose a topic and a research question, raise their own questions about theories and methods, and will hopefully cross the seeming chasm between science and practice. Challenges of research-based learning are the enhanced demands on students, teachers and curriculum design. © Georg Thieme Verlag KG Stuttgart · New York.
A Group has been set up by the CSNI to identify and review the issues which hinder closer co-operation on research between regulators and industry, and to propose possible ways for resolving such issues while maintaining regulatory independence in decision-making. The Group has analyzed the potential advantages and disadvantages of regulator-industry collaboration in safety research and has also provided indications on how to overcome possible difficulties that can arise from such collaboration. The Group focused in particular on the issue of regulator independence, on means to preserve it and ways to demonstrate it to the public while undertaking collaboration with industry
Petersen, Karen Bjerg
Based on a case study and interviews with e-learning teachers and learner participants in a virtual classroom setting, and on extracts of the curriculum developed for the particular e-learning course, the aim of the paper is to discuss how different positions in an e-learning triangle - teacher, learner or curriculum planner positions - result in different strategies or 'answers to modernity'. The research has taken place as a study of e-learning and virtual teaching of Danish as a second language for adults. The fact that relations in virtual learning are established between physically absent individuals, who are locationally distant and may never meet, seems to necessitate different strategies towards e-learning, depending on the position in the learning triangle. The research results indicate that teachers compensate for the disembedded social relations in e-learning environments …
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicized research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
Park, Hee Dae; Kim, C. K.; Kim, K. H.
KINS examined the application for licensing of research reactor fuel fabrication for seven months, from May to December 2000. The main issues during the examination, in determining whether the design and facilities meet the regulatory criteria, were the availability of basic grounds, design criteria on safety, availability and methodology of design, seismic criteria, availability of nuclear fuel fabrication, criticality safety, process safety, availability of nuclear waste management, validity of the organization and procedures for radioactivity management, and the validity of both the selection and the analysis of predicted accidents. Moreover, other issues, such as the radioactivity inspection plan for waste treatment, the effects of radioactive material and accidents, methods of reducing damage to the environment, and the environmental radioactivity inspection plan, were closely examined.
Benito Ramírez Valverde and Adrián González Romo
In Mexico, the great majority of coffee producers, mainly indigenous, live in poverty. In recent years, because of the coffee crisis, conditions of poverty and marginalization have been accentuated. Due to this situation, peasants look for alternatives for family survival, considering migration as one option. The purpose of this research is to analyze the relation between poverty, coffee production and migration, and the impact on peasant families in three indigenous municipalities. For this research, 49 peasants of the municipalities of San Felipe Tepatlán, Amixtlán and Hueytamalco in the Sierra Norte of Puebla were interviewed. The results show that the studied municipalities are under conditions of high marginalization. The level of schooling of the inhabitants is low and illiteracy is alarming. The producers of the region are small farm owners; the majority were organized (56.8%) and a large part (66.7%) sell their coffee in the community. Results indicate that those who leave the community looking for jobs are mainly the sons and daughters; their main reason for migrating is their poor economic situation. Regarding international migration, it was found that the number of people crossing the United States border looking for better opportunities is still low (4.7%). The future is thus predictable: if the crisis in the agricultural sector continues, a greater number of inhabitants of rural areas will emigrate towards the neighboring country as an alternative to the rural crisis.
Dubosarsky, Mia D.
How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program can be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.
Schell, Scott R
Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.
Winters, J. M.; Jungblut, D.; Catena, A. N.; Rubenstein, D. I.
Providing rigorous academic supplement to a professional development program for teachers, QUEST is a fusion of Drexel University's environmental science research department with Princeton University's Program in Teacher Preparation. Completed in the summers of 2012 (in partnership with Earthwatch) and 2013 in Barnegat Bay, New Jersey, QUEST's terrapin field research program enhances K-12 teachers' ecological knowledge, develops inquiry-based thinking in the classroom, and builds citizen science engagement. With a focus on quality question development and data analysis to answer questions, teachers are coached in developing, implementing, and presenting independent research projects on diamondback terrapin nesting ecology. As a result, teachers participating in QUEST's week long program bring a realistic example of science in action into their classrooms, helping to develop their own students' critical thinking skills. For teachers, this program provides training towards educating students on how to do real and imaginative science - subsequently sending students to university better prepared to engage in their own independent research. An essential component of the collaboration through QUEST, in addition to the teacher's experience during and after the summer institute, is the research data collected which supplements that of the Principal Investigator. In 2012, by documenting terrapin nest site predators, teachers gained valuable scientific experience, while Drexel acquired important ecological data which would have not been able to be collected otherwise. In 2013, teachers helped answer important questions about terrapin nesting success post Superstorm Sandy. In fact, the 2013 QUEST teachers are the first to visualize the frighteningly increased erosion of a primary terrapin nesting site due to Sandy; showing how most terrapin nests now lie in the bay, instead of safe on shore. Teachers comment that interacting with scientists in the field, and contributing to
Rodriguez, W. J.; Chaudhury, S. R.
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three-dimensions where the researcher can change the scales in the three-dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable
Kolbæk, Raymond; Steensgaard, Randi; Angel, Sanne
Major challenges occur when trying to implement research in clinical practice. In the West Danish Center for Spinal Cord Injury, we are doing a practice-based PhD project that involves the practice field's own members as co-researchers. In the management of the project we use … Furthermore, we try to evidence-base the concept of "Sample handlings" and examine whether this concept can be used as a flexible methodological tool for developing workflow that promotes patient participation in their own rehabilitation. We use an action research design to identify actual problems and to develop, test, evaluate and implement specific actions to promote patient participation in rehabilitation. Four nurses and four social and health assistants have an active "co-researcher" role. The interaction with the researchers creates a reflexive and dynamic process with a learning and competence …
The new Internet technologies have infiltrated the academic environment in a striking way, both at the individual and at the institutional level. More and more teachers have started educational blogs, librarians are active on Twitter, other educational actors curate web content, students post on Instagram or Flickr, and university departments have Facebook pages and/or YouTube accounts. Today, the use of web technology has become "a legitimate activity in many areas of higher education" (Waycott, 2010) and a considerable shift to digital academic research has gradually occurred. Teachers are encouraging students to take up digital tools for research and writing, thus revealing new ways of using information and communication technologies for academic purposes and not just for socializing. The main objective of this paper is to investigate the effects of integrating diverse digital Web 2.0 tools, resources and OERs/MOOCs in research and in the construction of students' academic texts. We aim to stress the increasing influence of digital and online tools in academic research and writing. Teachers, specialists, and students alike are affected by this process. In order to show how, we explore the following issues: What is Research 2.0? Which digital/online tools have we used to assist our students? What are the challenges for academic research using digital/Web 2.0 tools? And how do digital tools shape academic research?
Background: One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results: We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion: Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
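Genephony's actual interface is not shown in the abstract; the annotation task it describes (attaching genes to SNPs from a genotyping experiment) reduces, at its core, to an interval lookup. A generic sketch, with made-up gene coordinates and function names that are not part of Genephony:

```python
# Hypothetical data: these coordinates and names are illustrative only.
# Each gene is a (chromosome, start, end, name) record.
genes = [
    ("chr1", 100, 500, "GENE_A"),
    ("chr1", 800, 1200, "GENE_B"),
]

def annotate(snps):
    """Map each (snp_id, chromosome, position) to overlapping gene names.

    A SNP is annotated with every gene whose interval contains its
    position; SNPs outside all genes get an empty list.
    """
    return {
        snp_id: [
            name
            for chrom_g, start, end, name in genes
            if chrom_g == chrom and start <= pos <= end
        ]
        for snp_id, chrom, pos in snps
    }

result = annotate([("rs1", "chr1", 300), ("rs2", "chr1", 700)])
assert result == {"rs1": ["GENE_A"], "rs2": []}
```

A production tool would of course use indexed interval structures and real gene models rather than a linear scan, but the annotation semantics are the same.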
Many researchers collect online survey data because it is cost-effective and less time-consuming than traditional research methods. This paper describes Twitter chats as a research tool vis-à-vis two other online research methods: providing links to electronic surveys to respondents and use of commercially available survey panels through vendors with readily available respondents. Similar to a face-to-face focus group, Twitter chats provide a synchronous environment for participants to answer a structured series of questions and to respond to both the chat facilitator and each other. This paper also reports representative responses from a Twitter chat that explored financial decisions of young adults. The chat was sponsored by a multi-state group of land-grant university researchers, in cooperation with WiseBread, a personal finance website targeted to millennials, to recruit respondents for a more extensive month-long online survey about the financial decisions of young adults. The Twitter chat responses suggest that student loans were the top concern of participants, and debt and housing rounded out the top three concerns. The internet, both websites and social media, was the most frequently cited source of financial information. The article concludes with a discussion of lessons learned from the Twitter chat experience and suggestions for professional practice.
Waterlander, Wilma E; Scarpa, Michael; Lentz, Daisy; Steenhuis, Ingrid H M
Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and peoples' genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food purchasing behaviour. The application can be obtained via an URL
Steenhuis Ingrid HM
Full Text Available Abstract Background Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and peoples' genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66 revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food
Background Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and peoples' genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food purchasing behaviour. The
Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability, to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked"problems involving different, strongly opiniated, stakeholders with conflicting ideas and interests and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda", defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our:"societal license to research". For soil science this could imply a focus on the SDGs , considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water- and nutrient regimes and plant growth in soils both for actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensible role of research. But choices are to be made by stakeholders and reluctant policy makers and certainly not by scientists who should carefully guard their independance. Only clear results in the end are convincing proof for the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To
Ocepek, Melissa G.; Westbrook, Lynn
Online information seekers make heavy use of websites that accept their natural language questions. This study compared the three types of such websites: social question and answer (Q&A), digital reference services, and ask-an-expert services. Questions reflecting daily life, research, and crisis situations were posed to high use websites of all three types. The resulting answers' characteristics were analyzed in terms of speed, transparency, formality, and intimacy. The results indicate that social Q&A websites excel in speed, ask-an-expert websites in intimacy, and digital reference services in transparency and formality.
Valeria Gisela Perez
Full Text Available This paper develops a reflection about the importance of the research of accounting subjects in the professional accountants training, this importance is an attribute of research to increase the wealth of discipline under investigation, this can be converted into a skill and/or competence wich accountants are required to demonstrate in their professional practice.Furthermore, accounting is recognized by the authors as a science in constant development, being able to be investigated. This change in knowledge is an element that motivates professionals to be constantly updated, becoming this aspect (constant updating the skill and competence that research can bring to professional training in university classrooms.The reflection is based on the study of documents developed by prestigious authors in accounting theory, teaching and research.Therefore, this paper concludes that research is a useful tool for the professional accounting training, and rewards the important skills and competencies for professional practice; it can be conceived as well as a strategy for technical and educational activities that allows students to recreate knowledge, allowing future updates that will require their professional practice.Key words: Accounting research, university teaching, accounting education.
Weingart, R.C.; Chau, H.H.; Goosman, D.R.; Hofer, W.W.; Honodel, C.A.; Lee, R.S.; Steinberg, D.J.; Stroud, J.R.
We have developed a new tool for ultrahigh-pressure research at LLL. This system, which we call the electric gun, has already achieved thin flyer plate velocities in excess of 20 km/s and pressures of the order of 2 TPa in tantalum. We believe that the electric gun is competitive with laser- and nuclear-driven methods of producing shocks in the 1-to-5 TPa range because of its precision and ease and economy of operation. Its development is recommended for shock initiation studies, dry runs for Site 300 hydroshots, and as a shock wave generator for surface studies
Rabies: Questions and Answers Information about the disease and vaccines What causes rabies? Rabies is caused by a virus. The virus invades ... nervous system and disrupts its functioning. How does rabies spread? The rabies virus is transmitted in the ...
Experts of Food and Agriculture Organization (FAO)/ International Atomic Energy Agency (IAEA)/ World Health Organization (WHO) committee obtained their conclusion in 1980 that food irradiated with <10 kGy of radiation is safe for human health, which is now globally approved. However, in Japan, there have been still opposite opinions based on the doubt in the title on the safety of irradiated food. In this paper, the author answers those questions as he was a member to arrange the Research in the title for food irradiation. Described are data presentation and explanation about results of toxicity studies of diets added with irradiated materials of: weight reductions in rat ovary by irradiated potato (ip) in chronic studies, and in mouse testicle and ovary of F3 generation from the ancestor mice kept on diet with irradiated onion (io); bone malformation in mice by io; and reduction of body weight gain in female rats by ip and increase of mortality of male rats by ip. These are analyzed on the aspects of radiation dose-response, sustained tendency of results throughout the living period or generation, and apparent abnormality by other factors; and normal variation due to individual difference is pointed out to contribute to these findings. The safety test of irradiated food has been conducted valid not only in animal experiments but also other tests like genotoxicity and analysis of radiation-degraded products. (R.T.)
Full Text Available Introduction: This paper describes the development of a ‘Research for Impact’ Tool against a background of concerns about the over-researching of Aboriginal and Torres Strait Islander people’s issues without demonstrable benefits.Material and Methods: A combination of literature reviews, workshops with researchers and reflections by project team members and partners using participatory snowball techniques.Results: Assessing research impact is difficult, akin to so-called ‘wicked problem’, but not impossible. Heuristic and collaborative approach to research that takes in the expectations of research users, those being researched and the funders of research offers a pragmatic solution to evaluating research impact. The proposed ‘Research for Impact’ Tool is based on the understanding that the value of research is to create evidence and/or products to support smarter decisions so as to improve the human condition.Research is of limited value unless the evidence produced is used to inform smarter decisions. A practical way of approaching research impact is therefore to start with the decisions confronting decision makers whether they are government policymakers, professional practitioners or households and the extent to which the research supports smarter decisions and the knock-on consequences of such smart decisions. Embedded at each step in the impact planning, monitoring and evaluation process is the need for Indigenous leadership and participation, capacity enhancement and collaborative partnerships and participatory learning by doing approaches across partners.Discussion: The tool is designed in the context of Indigenous research but the basic idea that the way to assess research impact is to start upfront by defining the users’ of research and their information needs, the decisions confronting them and the extent to which research informs smarter decisions is equally applicable to research in other settings, both applied and
Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
Roeder, L.; Jundt, R.
Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.
Hartley, D.S. III
This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff`s Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.
Software Tools for Battery Design Software Tools for Battery Design Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to help using CAEBAT software tools. Knowledge of the interplay of multi-physics at varied scales is imperative
Kaczmarczyk, Lech; Jackson, Walker S
The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.
This document, prepared in February 1993, addresses the most common questions asked by APS Collaborative Access Teams (CATs). The answers represent the best judgment on the part of the APS at this time. In some cases, details are provided in separate documents to be supplied by the APS. Some of the answers are brief because details are not yet available. The questions are separated into five categories representing different aspects of CAT interactions with the APS: (1) Memorandum of Understanding (MOU), (2) CAT Beamline Review and Construction, (3) CAT Beamline Safety, (4) CAT Beamline Operations, and (5) Miscellaneous. The APS plans to generate similar documents as needed to both address new questions and clarify answers to present questions
Kim, Jong-Won; Kim, Dogyun
Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.
Leggett, Graham J
Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle particle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide a powerful support for research into metamaterials.
Alexander, Serena; Poggo, Tammy
Features the complete set of answers to the exercises in Mathematics Year 5, to save you time marking work and enable you to identify areas requiring further attention. The book includes diagrams and workings where necessary, to ensure pupils understand how to present their answers. Also available from Galore Park www.galorepark.co.uk :. - Mathematics Year 5. - Mathematics Year 6. - 11+ Maths Practice Exercises. - 11+ Maths Revision Guide. - 10-Minute Maths Tests Workbook Age 8-10. - 10-Minute Maths Tests Workbook Age 9-11. - Mental Arithmetic Workbook Age 8-10. - Mental Arithmetic Workbook Ag
This book constitutes the refereed proceedings of the 10th International Conference on Flexible Query Answering Systems, FQAS 2013, held in Granada, Spain, in September 2013. The 59 full papers included in this volume were carefully reviewed and selected from numerous submissions. The papers...... are organized in a general session train and a parallel special session track. The general session train covers the following topics: querying-answering systems; semantic technology; patterns and classification; personalization and recommender systems; searching and ranking; and Web and human...
Full Text Available Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO is a Bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the Web-accessible resources and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from everywhere through the internet or may be run locally if a large number of sequences are going to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to access it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum computer free space required is 2 MB.
This book constitutes the refereed proceedings of the 12th International Conference on Flexible Query Answering Systems, FQAS 2017, held in London, UK, in June 2017. The 21 full papers presented in this book together with 4 short papers were carefully reviewed and selected from 43 submissions...
Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa
In this work we present the capabilities of the designed and realized extreme ultraviolet laboratory exposure tool (EUVLET) which has been developed at the RWTH-Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. Main purpose of this laboratory setup is the direct application in research facilities and companies with small batch production, where the fabrication of high resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and evaluation of its pre- and post-exposure processing. The tool utilizes a partially coherent discharge produced plasma (DPP) source and minimizes the number of other critical components to a transmission grating, the photoresist coated wafer and the positioning system for wafer and grating and utilizes the Talbot lithography approach. To identify the limits of this approach first each component is analyzed and optimized separately and relations between these components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron beam resists and one EUV resist have been characterized by open frame exposures to determine their contrast under EUV radiation. Cold development procedure has been performed to further increase the resist contrast. By analyzing the exposure results it can be demonstrated that only a 1:1 copy of the mask structure can be fully resolved by the utilization of amplitude masks. The utilized phase-shift masks offer higher 1st order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
Grace, Stephen C; Embry, Stephen; Luo, Heng
Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization options and supports non
Attali, Yigal; Powers, Don; Freedman, Marshall; Harrison, Marissa; Obetz, Susan
This report describes the development, administration, and scoring of open-ended variants of GRE® Subject Test items in biology and psychology. These questions were administered in a Web-based experiment to registered examinees of the respective Subject Tests. The questions required a short answer of 1-3 sentences, and responses were automatically…
Mora, J.C.; Robles, Beatriz [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Bradshaw, Clare; Stark, Karolina [Stockholm University (Sweden); Sweeck, Liev; Vives i Batlle, Jordi [Belgian Nuclear Research Centre SCK-CEN (Belgium); Beresford, Nick [Centre for Ecology and Hydrology - CEH (United Kingdom); Thoerring, Havard; Dowdall, Mark [Norwegian Radiation Protection Authority - NRPA (Norway); Outola, Iisa; Turtiainen, Tuukka; Vetikko, Virve [STUK - Radiation and Nuclear Safety Authority (Finland); Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Beaugelin-Seiller, Karine; Fevrier, Laureline; Hurtevent, Pierre; Boyer, Patrick [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)
Interaction Matrices as a Tool for Prioritizing Radioecology Research J.C. Mora CIEMAT In 2010 the Strategy for Allied Radioecology (STAR) was launched with several objectives aimed at integrating the radioecology research efforts of nine institutions in Europe. One of these objectives was the creation of European Radioecology Observatories. The Chernobyl Exclusion Zone (CEZ) and the Upper Silesian Coal Basin (USCB), a coal mining area in Poland, were chosen after a selection process. A second objective was to develop a system for improving and validating the capabilities of predicting the behaviour of the main radionuclides existing at these observatories. Interaction Matrices (IMs) have been used since the 1990s as a tool for developing ecological conceptual models and have also been used within radioecology. The Interaction Matrix system relies on expert judgement to structure knowledge of a given ecosystem at the conceptual level and was selected for use in the STAR project. A group of experts, selected from each institution of STAR, designed two matrices with the main compartments for each ecosystem (a forest in the CEZ and a lake in the USCB). All the features, events and processes (FEPs) that could affect the behaviour of the considered radionuclides, focusing on radiocaesium in the Chernobyl forest and radium in the Rontok-Wielki lake, were also included in each IM. Two new sets of experts were appointed to review, improve and prioritize the processes included in each IM. A first processing of the various candidate interaction matrices produced a single interaction matrix for each ecosystem which incorporated all the experts' combined knowledge. The prioritization of processes in the IMs, directed towards developing a whole predictive model of radionuclide behaviour in those ecosystems, raised interesting issues about the processes and parameters involved and the existing knowledge of them. This exercise revealed several processes
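The Interaction Matrix formalism (compartments on the leading diagonal, processes linking compartment i to compartment j in the off-diagonal cells) can be sketched as a small data structure. The compartments and processes below are illustrative only, not the STAR experts' actual matrices:

```python
compartments = ["soil", "understorey", "trees", "fungi"]

# im[i][j] holds the processes by which compartment i acts on compartment j;
# the diagonal cells (i == j) conceptually hold the compartments themselves.
n = len(compartments)
im = [[[] for _ in range(n)] for _ in range(n)]

def add_process(source, target, process):
    """Record one feature/event/process (FEP) in the matrix."""
    im[compartments.index(source)][compartments.index(target)].append(process)

add_process("soil", "trees", "root uptake of radiocaesium")
add_process("trees", "soil", "litterfall")
add_process("fungi", "soil", "mycelial translocation")

# Enumerate every interaction for later expert review and prioritization
interactions = [(compartments[i], compartments[j], p)
                for i in range(n) for j in range(n) for p in im[i][j]]
```

A prioritization pass would then attach expert-assigned weights to each entry of `interactions`, which is the step the abstract describes as raising questions about existing knowledge.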
Millar, A. Z.; Perry, S.
Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, and with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part
Mychasiuk, R; Benzies, K
Facebook is currently one of the world's most visited websites, and home to millions of users who access their accounts on a regular basis. Owing to the website's ease of accessibility and free service, the demographic characteristics of users span all domains. As such, Facebook may be a valuable tool for locating and communicating with participants in longitudinal research studies. This article outlines the benefit gained in a longitudinal follow-up study of an intervention programme for at-risk families through the use of Facebook as a search engine. Using Facebook as a resource, we were able to locate 19 participants who were otherwise 'lost' to follow-up, decreasing attrition in our study by 16%. Additionally, analysis indicated that hard-to-reach participants located with Facebook differed significantly on measures of receptive language and self-esteem when compared to their easier-to-locate counterparts. These results suggest that Facebook is an effective means of improving participant retention in a longitudinal intervention study and may help improve study validity by reaching participants that contribute differing results. © 2011 Blackwell Publishing Ltd.
Dense Breasts: Answers to Commonly Asked Questions. What are dense breasts? Breasts contain glandular, connective, and fat tissue. Breast density is a term that describes the ...
The correct answers to the Staff Association Competition are: How many women delegates are there currently in the Staff Council? 14. Who is the current President of the Staff Association? Alessandro Raimondo. Which year was the Nursery School established by the Staff Association at CERN? 1965. How many CERN clubs are supported by the Staff Association? 44. What is the supreme representative body of the Staff Association? The Staff Council. The winners will be informed by email.
Moore, John W.
porous medium. Still another derivation is required (and a different result is obtained) when two gases diffuse into a third gas from opposite ends of a constant-volume container. Graham's law is typically demonstrated either by diffusion through a porous frit or by diffusion of HCl and NH3 into air from opposite ends of a glass tube. Neither of these involves experimental conditions that satisfy the assumptions of the derivation presented in most texts. Also misleading are demonstrations in which perfume released in one place becomes detectable throughout a room, or a crystal of KMnO4 dissolves and disperses to form a uniformly colored aqueous solution. Unless special precautions are taken, dispersion of a substance into a fluid depends more on convection than on diffusion (4). In one case a study of student misconceptions about diffusion was based on the researcher's misconception that dispersion of a dye in water during a period of only a few minutes was an illustration of diffusion (5). If we can convince ourselves that we have accurately determined an expected result or demonstrated a principle, even though the experiment or demonstration should not give that result, then most students are also likely to be convinced. It is important that they learn that skepticism and courteous, rational discourse are important components of scientific progress. Persistent misconceptions such as the two described above provide a golden opportunity to involve students in such discourse. We could, for example, demonstrate both a method that works and one that does not, compare results, and ask students to suggest additional experiments that might resolve the issue. (Steel wool and 0.25 M acetic acid can be used to achieve a reproducible and reasonably accurate determination of the fraction of oxygen in air. Davis reports that rates of diffusion in an agar gel, which minimizes convection, are essentially the same as in water, which provides a way of showing how slow diffusion
Analysis Tools. NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) to find cost-competitive solutions. ADOPT: Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and
One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper, based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified; it obtained the parameter values used to measure equipment performance, along with suggestions for improvement.
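A transition-rule table over tool states, replayed against an event log, is the essence of a finite-state-machine performance model. The states and rules below are an illustrative subset loosely following SEMI E10 state categories, not the paper's actual TEA model:

```python
# Allowed transitions between equipment states (illustrative subset)
RULES = {
    ("idle", "start"): "productive",
    ("productive", "finish"): "idle",
    ("productive", "fault"): "unscheduled_down",
    ("unscheduled_down", "repair"): "idle",
    ("idle", "maintain"): "scheduled_down",
    ("scheduled_down", "release"): "idle",
}

def run(events, state="idle"):
    """Replay an event log, accumulating time spent in each state.

    Each event is (name, duration-in-state-before-event); an event pair
    absent from RULES raises KeyError, flagging an illegal transition.
    """
    time_in = {}
    for event, duration in events:
        time_in[state] = time_in.get(state, 0) + duration
        state = RULES[(state, event)]
    return state, time_in

final, usage = run([("start", 5), ("fault", 20), ("repair", 3), ("start", 2)])
```

Per-state time totals like `usage` are the raw material for throughput and availability metrics of the kind the paper derives.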
Rector, Travis A.; Vogt, Nicole P.
Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes).Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database.We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
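Two of the measurements the tools support, redshift from an identified line and equivalent width against a fitted continuum, reduce to short formulas. The sketch below is illustrative Python only (the GEAS tools themselves are HTML5 applications):

```python
def redshift(observed, rest):
    """z from an identified spectral line: z = (lambda_obs - lambda_rest) / lambda_rest."""
    return (observed - rest) / rest

def equivalent_width(wavelengths, flux, continuum):
    """Trapezoidal equivalent width of a line against a fitted continuum.

    EW = integral of (1 - F/C) d(lambda) over the line; positive for absorption.
    """
    depth = [1.0 - f / c for f, c in zip(flux, continuum)]
    ew = 0.0
    for k in range(len(wavelengths) - 1):
        ew += 0.5 * (depth[k] + depth[k + 1]) * (wavelengths[k + 1] - wavelengths[k])
    return ew

# H-alpha (rest wavelength 6562.8 angstroms) observed at 7219.1 angstroms
z = redshift(7219.1, 6562.8)
```

The same two routines, applied to a galaxy spectrum from the tools' archive or an SDSS query, reproduce the redshift and line-strength measurements students make in the GUI.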
In this paper, a test-bench for a sonic logging tool is proposed and designed to realize automatic calibration and testing of the tool. The test-bench system consists of a Host Computer, an Embedded Controlling Board, and functional boards. The Host Computer serves as the Human Machine Interface (HMI) and processes uploaded data. The software running on the Host Computer is developed in VC++ using multithreading, Dynamic Link Library (DLL) and Multiple Document Interface (MDI) techniques. The Embedded Controlling Board uses an ARM7 microcontroller and communicates with the Host Computer via Ethernet. The Embedded Controlling Board software is realized on the embedded uClinux operating system with a layered architecture. The functional boards are designed around Field Programmable Gate Arrays (FPGAs) and provide test interfaces for the logging tool. The functional board software is divided into independent sub-modules that can be reused by various functional boards and then integrated in the top layer. With the layered architecture and modularized design, the software system is highly reliable and extensible. With the help of the designed system, a test was conducted quickly and successfully on the electronic receiving cabin of the sonic logging tool, demonstrating that the system can greatly improve the production efficiency of the sonic logging tool.
Bruun Larsen, Lars; Skonnord, Trygve; Gjelstad, Svein
… in primary care research. Examples of this are online randomisation, electronic questionnaires, automatic email scheduling, mobile phone applications and data extraction tools. The amount of data can be increased at low cost, and this can help to reach adequate sample sizes. However, there are still … challenges within the field. To secure a high response rate, you need to follow up manually or use another application. There are also practical and ethical problems, and the security of sensitive data has to be managed carefully. Session content: oral presentations about some technological …
Generic Drugs: Questions & Answers
Li, Xiaoyan; Croft, W. B
.... Specifically, we explore the use of question-answering techniques for novelty detection. New information is defined as new/previously unseen answers to questions representing a user's information need...
Dec 29, 2008 … It is on this premise that this article presents Bayes' theorem as a vital tool. A brief intuitive … diseased individual will be selected or that a disease-free individual will be selected? … Ultrasound physics and instruction, 3rd ed …
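The selection question the article raises maps directly onto Bayes' theorem: given a positive test, what is the probability the selected individual is actually diseased? A worked sketch, with assumed prevalence, sensitivity and specificity (not figures from the article):

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(diseased | test positive) via Bayes' theorem.

    Numerator: true positives. Denominator: all positives,
    i.e. true positives plus false positives.
    """
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assumed figures: 2% prevalence, 95% sensitivity, 90% specificity
ppv = positive_predictive_value(0.02, 0.95, 0.90)
```

Even with a sensitive test, the low prevalence drives the posterior probability well below the sensitivity, which is the intuition Bayes' theorem makes precise.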
Kirkpatrick, CJ; Otto, M; van Kooten, T; Krump; Kriegsmann, J; Bittinger, F
Progress in biocompatibility and tissue engineering would today be inconceivable without the aid of in vitro techniques. Endothelial cell cultures represent a valuable tool not just in haemocompatibility testing, but also in the concept of designing hybrid organs. In the past endothelial cells (EC)
The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
The types and uses of research reactors are reviewed. After an analysis of the world situation, demand for new research reactors of about 20 MW is foreseen. The experience and competitiveness of INVAP S.E. as a designer and constructor of research reactors is outlined, and the general specifications of the reactors designed by INVAP for Egypt and Australia are given.
Veeck, Ann; Hoger, Beth
Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…
Full Text Available This contribution describes ‘Research Game’, a game produced in a Lifelong Learning Programme-Comenius Project (The European Scientific Research Game which aims at motivating secondary school students through the experience of the excitement of scientific research. The project proposes practical and didactic works which combine theoretical activities with ICT in order to introduce students to the scientific research. Students collaborated internationally across Europe, to build hypotheses, carry out research, test the validity of their hypothesis and finalize a theory based on their findings. On the project platform (www.researchgame.eu/platform teachers and students registered, created a team, interacted on a forum space, played and learned science in a new innovative way. Here, the students shared their research findings with other groups of all Europe; finally competed online playing a serious game and showing to be able to apply the scientific method.
A question answering system allows a user to state his or her information need as a natural language question and returns short text excerpts, or even phrases, as an answer. The availability of wide and varied information sources, together with improvements in natural language processing, information extraction (wrappers), and information retrieval, has strongly influenced the development of question answering systems: from answering questions in a specific domain by consulting structured sources such as databases to, as in this research, answering questions from information stored in an unstructured text collection. A general architecture for a text-based question answering system consists of six processing stages: question analysis, document collection preprocessing, candidate document selection, candidate document analysis, answer extraction, and response generation. Applications such as AnswerBus, Mulder, and Webclopedia, each developed with its own characteristics, follow processing steps similar to this general architecture. Answers returned by a question answering system must be evaluated to measure performance. This research concludes with a simple question answering application that uses the English Bible in the World English Bible (WEB) version as its information source. Because a specific domain (the Bible) was selected, users can only ask about information in the Bible itself. Questions are also limited to the three answer types supported by the application: person (who), location (where), and date (when).
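The staged architecture can be illustrated with a toy keyword-overlap sketch. The helper names and scoring are hypothetical simplifications, not the pipelines of AnswerBus, Mulder, or Webclopedia:

```python
def analyze_question(q):
    """Question analysis: classify the expected answer type and extract keywords."""
    wh = q.split()[0].lower()
    qtype = {"who": "person", "where": "location", "when": "date"}.get(wh, "other")
    keywords = {w.strip("?.,").lower() for w in q.split()[1:]}
    return qtype, keywords

def select_candidates(keywords, docs):
    """Candidate document selection: rank documents by keyword overlap."""
    scored = [(len(keywords & set(d.lower().split())), d) for d in docs]
    return [d for s, d in sorted(scored, reverse=True) if s > 0]

def answer(q, docs):
    """End-to-end: analysis -> selection -> (crude) answer extraction."""
    qtype, kws = analyze_question(q)
    candidates = select_candidates(kws, docs)
    return candidates[0] if candidates else None

docs = ["Moses led Israel out of Egypt",
        "The ark rested on the mountains of Ararat"]
best = answer("Who led Israel out of Egypt?", docs)
```

A real system would add the remaining stages (collection preprocessing, candidate document analysis, response generation), but the data flow between stages is the same.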
Abbott, Rodman P.; Stracener, Jerrell
This study investigates the relationship between the designated research project system independent variables of Labor, Travel, Equipment, and Contract total annual costs and the dependent variables of both the associated matching research project total annual academic publication output and thesis/dissertation number output. The Mahalanobis…
Noyons, Everard Christiaan Marie
Bibliometric maps of science are landscapes of scientific research fields created by quantitative analysis of bibliographic data. In such maps the 'cities' are, for instance, research topics. Topics with a strong cognitive relation are in each other's vicinity and topics with a weak relation are
Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox
Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...
Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…
Roo, A.P.J. de; Thielen, J.; Feyen, L.; Burek, P.; Salamon, P.
The floods in the rivers Meuse and Rhine in 1993 and 1995 made the European Commission realize that also at Commission level further research on floods – especially in transboundary river catchments - was necessary. This led to the start of a dedicated research project on floods at the European
Karlsson, Bodil S. A.; Allwood, Carl Martin
The Dress photograph, first displayed on the internet in 2015, revealed stunning individual differences in color perception. The aim of this study was to investigate if lay-persons believed that the question about The Dress colors was answerable. Past research has found that optimism is related to judgments of how answerable knowledge questions with controversial answers are (Karlsson et al., 2016). Furthermore, familiarity with a question can create a feeling of knowing the answer (Reder and...
The objective of this study is to develop an evidence-based research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT). A review was conducted drawing from the (1) implementati...
The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding user response and market ...
Ivancic, William D.
Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network-centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies, often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and the manipulation of information and information access are addressed.
Klein, P. D.; Hachey, D. L.; Kreek, M. J.; Schoeller, D. A.
Recent developments in the use of the stable isotopes ¹³C, ¹⁵N, ¹⁷O, and ¹⁸O as tracers in research studies in the fields of biology, medicine, pharmacology, and agriculture are briefly reviewed. (CH)
Hartrumpf, Sven; Glöckner, Ingo; Leveling, Johannes
The German question answering (QA) system IRSAW (formerly: InSicht) participated in QA@CLEF for the fifth time. IRSAW was introduced in 2007 by integrating the deep answer producer InSicht, several shallow answer producers, and a logical validator. InSicht builds on a deep QA approach: it transforms documents to semantic representations using a parser, draws inferences on semantic representations with rules, and matches semantic representations derived from questions and documents. InS...
Massi, Luciana; Santos, Gelson Ribeiro dos; Ferreira, Jerino Queiroz; Queiroz, Salete Linhares
Chemistry teachers increasingly use research articles in their undergraduate courses. This trend arises from current pedagogical emphasis on active learning and scientific process. In this paper, we describe some educational experiences on the use of research articles in chemistry higher education. Additionally, we present our own conclusions on the use of such methodology applied to a scientific communication course offered to undergraduate chemistry students at the University of São Paulo, ...
This volume constitutes the proceedings of the Seventh International Conference on Flexible Query Answering Systems, FQAS 2006, held in Milan, Italy, on June 7-10, 2006. FQAS is the premier conference for researchers and practitioners concerned with the vital task of providing easy, flexible … human-computer interaction. The overall theme of the FQAS conferences is innovative query systems aimed at providing easy, flexible, and intuitive access to information. Such systems are intended to facilitate retrieval from information repositories such as databases, libraries, and the World-Wide Web. These repositories are typically equipped with standard query systems which are often inadequate, and the focus of FQAS is the development of query systems that are more expressive, informative, cooperative, and productive. These proceedings contain contributions from invited speakers and 53 original papers out of about 100 …
Hakim, Toufic M.; Garg, Shila
The National Science Foundation's 1996 report "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering and Technology" urged that in order to improve SME&T education, decisive action must be taken so that "all students have access to excellent undergraduate education in science .... and all students learn these subjects by direct experience with the methods and processes of inquiry." Research-related educational activities that integrate education and research have been shown to be valuable in improving the quality of education and enhancing the number of majors in physics departments. Student researchers develop a motivation to continue in science and engineering through an appreciation of how science is done and the excitement of doing frontier research. We will address some of the challenges of integrating research into the physics undergraduate curriculum effectively. The departmental and institutional policies and infrastructure required to help prepare students for this endeavor will be discussed as well as sources of support and the establishment of appropriate evaluation procedures.
Otrel-Cass, Kathrin; Cowie, Bronwen
When practising teachers take time to exchange their experiences and reflect on their teaching realities as critical friends, they add meaning and depth to educational research. When peer talk is facilitated through video chat platforms, teachers can meet (virtually) face to face even when … recordings were transcribed and used to prompt further discussion. The recording of the video chat meetings provided an opportunity for researchers to listen in and follow up on points they felt needed further unpacking or clarification. The recorded peer video chat conversations provided an additional … opportunity to stimulate and support teacher participants in a process of critical analysis and reflection on practice. The discussions themselves were empowering because, in the absence of the researcher, the teachers, in negotiation with peers, chose what was important enough to them to take time to discuss …
Villanti, Andrea C; Feirman, Shari P; Niaura, Raymond S; Pearson, Jennifer L; Glasser, Allison M; Collins, Lauren K; Abrams, David B
To propose a hierarchy of methodological criteria to consider when determining whether a study provides sufficient information to answer the question of whether e-cigarettes can facilitate cigarette smoking cessation or reduction. A PubMed search to 1 February 2017 was conducted of all studies related to e-cigarettes and smoking cessation or reduction. Australia, Europe, Iran, Korea, New Zealand and the United States. 91 articles. Coders organized studies according to six proposed methodological criteria: (1) examines outcome of interest (cigarette abstinence or reduction), (2) assesses e-cigarette use for cessation as exposure of interest, (3) employs appropriate control/comparison groups, (4) ensures that measurement of exposure precedes the outcome, (5) evaluates dose and duration of the exposure and (6) evaluates the type and quality of the e-cigarette used. Twenty-four papers did not examine the outcomes of interest. Forty did not assess the specific reason for e-cigarette use as an exposure of interest. Twenty papers did not employ prospective study designs with appropriate comparison groups. The few observational studies meeting some of the criteria (duration, type, use for cessation) triangulated with findings from three randomized trials to suggest that e-cigarettes can help adult smokers quit or reduce cigarette smoking. Only a small proportion of studies seeking to address the effect of e-cigarettes on smoking cessation or reduction meet a set of proposed quality standards. Those that do are consistent with randomized controlled trial evidence in suggesting that e-cigarettes can help with smoking cessation or reduction. © 2017 Society for the Study of Addiction.
There are many ways in which pain in animals can be measured, based on a variety of phenomena related either to the perception of pain or to alterations in physical or behavioural features of the animal caused by that pain. The features of pain that are most useful for assessment in clinical environments are not always the best to use in a research environment. This is because the aims and objectives of the two settings differ, so while particular techniques have the same advantages and disadvantages in clinical and research environments, these considerations may become more or less of a drawback when moving from one environment to the other. For example, a simple descriptive pain scale has a number of advantages and disadvantages. In a clinical setting the advantages are very useful and the disadvantages are less relevant, but in a research environment the advantages are less important and the disadvantages can become more problematic. This paper focuses on pain in the research environment and, after a brief review of the pathophysiological systems involved, attempts to outline the major advantages and disadvantages of the more commonly used measurement techniques in studies of pain perception and analgesia. This paper is expanded from a conference proceedings paper presented at the International Veterinary Emergency and Critical Care Conference in San Diego, USA.
Ramalho, A.J.G.; Marques, J.G.; Cardeira, F.M.
A short presentation is made of the utilisation of the Portuguese Research Reactor, its problems and the solutions found. Starting with the initial calibration and experiments, routine operation at full power follows. The problems encountered, which led to the refurbishment, are then discussed. The present status of the system is presented, and from that, conclusions for the future are derived. (author)
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it
Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian
In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…
Brownell, Marni D.; Jutte, Douglas P.
Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…
Baran, Evrim; Chuang, Hsueh-Hua; Thompson, Ann
TPACK (technological pedagogical content knowledge) has emerged as a clear and useful construct for researchers working to understand technology integration in learning and teaching. Whereas first generation TPACK work focused upon explaining and interpreting the construct, TPACK has now entered a second generation where the focus is upon using…
Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.
Although much attention has been paid to business processes over the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research
Hu, Ze; Zhang, Zhan; Yang, Haiqin; Chen, Qing; Zuo, Decheng
Recently, online health expert question-answering (HQA) services (systems) have attracted more and more health consumers, who ask health-related questions anywhere at any time thanks to their convenience and effectiveness. However, the quality of answers in existing HQA systems varies across situations, so it is important to provide effective tools that automatically determine the quality of answers. Two main characteristics of HQA systems raise difficulties for classification: (1) physicians' answers are usually written as short text, which yields a data-sparsity issue; (2) HQA systems apply a quality control mechanism that restrains the wisdom of the crowd, so important information, such as the best answer and the number of users' votes, is missing. To tackle these issues, we prepare the first HQA research dataset, labeled by three medical experts over 90 days, and formulate the prediction of answer quality in the system as a classification task. We incorporate not only the standard textual features of answers but also a set of unique non-textual features from other modalities, i.e., popular surface linguistic features and novel social features. A multimodal deep belief network (DBN)-based learning framework is then proposed to learn high-level hidden semantic representations of answers from both textual and non-textual features, and the learned joint representation is fed into popular classifiers to determine the quality of answers. Finally, we conduct extensive experiments to demonstrate the effectiveness of including the non-textual features and of the proposed multimodal deep learning framework. Copyright © 2017 Elsevier Inc. All rights reserved.
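Setting the DBN itself aside, the fusion of modalities can be illustrated by concatenating textual features with social/surface features into one vector for a downstream classifier. The specific features and field names below are assumptions for illustration, not the paper's feature set:

```python
def textual_features(answer):
    """Simple textual features of a short answer."""
    words = answer.split()
    return [len(words),                                       # answer length
            sum(len(w) for w in words) / max(len(words), 1),  # mean word length
            answer.count(",") + answer.count(".")]            # punctuation count

def fuse(answer, social):
    """Concatenate textual features with non-textual (social/surface)
    features, e.g. an assumed physician rating and response delay."""
    return textual_features(answer) + [social["rating"], social["delay_h"]]

vec = fuse("Take the tablet twice daily, with food.",
           {"rating": 4.8, "delay_h": 2.0})
```

In the paper's framework, a vector like `vec` would be replaced by the joint representation learned by the multimodal DBN before classification; the sketch only shows why combining modalities yields a richer input than text alone.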
Yu, T.; Lu, R.; Bishop, L.
Biofilm processes are widely utilized in environmental engineering for the biodegradation of contaminated waters, gases and soils. It is important to understand the structure and functions of biofilms. Microelectrodes are novel experimental tools for environmental biofilm studies. The authors reviewed oxygen, sulfide, redox potential and pH microelectrode techniques. These microelectrodes have tip diameters of 3 to 20 μm, resulting in a high spatial resolution, and enable us to directly measure the chemical conditions resulting from microbial activities in biofilms. The authors also reported laboratory and field studies of wastewater biofilms using microelectrode techniques. The results of these studies provided experimental evidence on the stratification of microbial processes and the associated redox potential change in wastewater biofilms: (1) The oxygen penetration depth was only a fraction of the biofilm thickness. This observation, first made under laboratory conditions, has been confirmed under field conditions. (2) Biofilms with both aerobic oxidation and sulfate reduction had a clearly stratified structure, as evidenced by a sharp decrease of redox potential near the interface between the aerobic zone and the sulfate reduction zone within the biofilm. In this type of biofilm, aerobic oxidation took place only in a shallow layer near the biofilm surface and sulfate reduction occurred in the deeper anoxic zone. (3) The redox potential changed with the shift of the primary microbial process in biofilms, indicating that it is possible to use redox potential to help illustrate the structure and functions of biofilms. (author)
Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others. There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...
Wright, J; Wagner, A
Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...
Blažková, Michaela; Člověčko, M.; Eltsov, V. B.; Gažo, E.; de Graaf, R.; Hosio, J.J.; Krusius, M.; Schmoranzer, D.; Schoepe, W.; Skrbek, Ladislav; Skyba, P.; Solntsev, R.E.; Vinen, W. F.
Roč. 150, - (2008), s. 525-535 ISSN 0022-2291 R&D Projects: GA ČR GA202/05/0218 Grant - others: GAUK(CZ) 7953/2007; Transnational Access Programme(XE) RITA-CT-2003-505313 Institutional research plan: CEZ:AV0Z10100520 Keywords: normal 3He * superfluid 3He * superfluid 4He * turbulence * cavitation * quartz tuning fork Subject RIV: BK - Fluid Dynamics Impact factor: 1.034, year: 2008
An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included, and an overall assessment of the state-of-the-art of analytical microplasma research is provided.
This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth Sciences research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft and is capable of conducting long-duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), the Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.
Lal, Shalini; Donnelly, Catherine; Shin, Jennifer
Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.
Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi
The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for exploration-class missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community, and has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports that identify "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.
Full Text Available Alexander M Walker,1 Amanda R Patrick,2 Michael S Lauer,3 Mark C Hornbrook,4 Matthew G Marin,5 Richard Platt,6 Véronique L Roger,7 Paul Stang,8 Sebastian Schneeweiss2
1World Health Information Science Consultants, Newton, MA; 2Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women's Hospital, Boston, MA; 3National Heart, Lung, and Blood Institute, National Institutes of Health, Bethesda, MD; 4The Center for Health Research, Kaiser Permanente Northwest, Portland, OR; 5Department of Medicine, New Jersey Medical School, Newark, NJ; 6Department of Population Medicine, Harvard Pilgrim Health Care Institute and Harvard Medical School, Boston, MA; 7Department of Health Sciences Research, Mayo Clinic, Rochester, MN; 8Johnson and Johnson Pharmaceutical Research and Development, Titusville, NJ, USA
Background: Comparative effectiveness research (CER) provides actionable information for health care decision-making. Randomized clinical trials cannot provide the patients, time horizons, or practice settings needed for all required CER. The need for comparative assessments and the infeasibility of conducting randomized clinical trials in all relevant areas is leading researchers and policy makers to non-randomized, retrospective CER. Such studies are possible when rich data exist on large populations receiving alternative therapies that are used as-if interchangeably in clinical practice. This setting we call "empirical equipoise." Objectives: This study sought to provide a method for the systematic identification of settings of empirical equipoise in which non-randomized CER holds promise. Methods: We used a standardizing transformation of the propensity score called "preference" to assess pairs of common treatments for uncomplicated community-acquired pneumonia and new-onset heart failure in a population of low-income elderly people in Pennsylvania, for whom we had access to de-identified insurance records. Treatment
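The "preference" transformation of the propensity score can be sketched as follows, assuming the standardization described by Walker and colleagues: the logit of the preference score equals the logit of the propensity score minus the logit of the overall treatment prevalence. The function name and signature are our own, not the paper's.

```python
import math

def preference_score(ps: float, prevalence: float) -> float:
    """Standardize a propensity score by overall treatment prevalence:
    logit(F) = logit(PS) - logit(prevalence).
    A preference of 0.5 means the patient is as likely as the average
    patient to receive the treatment, regardless of how common it is."""
    logit = lambda p: math.log(p / (1.0 - p))
    x = logit(ps) - logit(prevalence)
    return 1.0 / (1.0 + math.exp(-x))  # inverse logit back to [0, 1]

# A propensity equal to the prevalence maps to a neutral preference.
f_neutral = preference_score(0.3, 0.3)
f_higher = preference_score(0.6, 0.3)
```

Under this standardization, two treatments with very different market shares can be compared on a common preference scale, which is what makes screening many treatment pairs for empirical equipoise tractable.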
Scanlon, J.J.; Rolader, G.E.; Jamison, K.A.; Petresky, H.
Electromagnetic Launcher (EML) research at the Air Force Armament Laboratory, Hypervelocity Launcher Branch (AFATL/SAH), Eglin AFB, has focused on developing the technologies required for repetitively launching several-kilogram payloads to high velocities. Previous AFATL/SAH experiments have been limited by the available power supply, resulting in small muzzle energies on the order of hundreds of kilojoules. In an effort to advance the development of EMLs, AFATL/SAH has designed and constructed a battery power supply (BPS) capable of providing several megaamperes of current for several seconds. This system consists of six modules, each containing 2288 automotive batteries, which may be connected in two different series-parallel arrangements. In this paper the authors define the electrical characteristics of the AFATL battery power supply at the component level
Rogers, Jan; SanSoucie, Mike
Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts and of high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.
Ion Danut I. JUGANARU
Full Text Available This study aims at analyzing the distribution of tourist flows in 2014, from 25 European countries, across three main categories of trip purposes, and assumes that there are differences or similarities between the tourists' countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups, reunited in terms of similar characteristics. Being aware that the decision of choice/purchase is based on purposes, their knowledge proves useful in designing strategies to increase the satisfaction level provided to the customer. The statistical method used in this paper is factorial correspondence analysis. In our opinion, the identification, by this method, of the existence of differences or similarities between the tourists' countries of residence and their trip purposes can represent a useful step in studying the tourism market and the choice/reformulation of strategies.
D R Simmons
Full Text Available A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image which is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft-Palate Craniofacial Journal, in press). We have argued that "scarriness" is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information that allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Other potential descriptors for other image classes would be "metallic", "natural", or "liquid". Having developed an automatic algorithm to locate scars in medical images, we then tested "ground truth" by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor "scarriness". A simultaneous analysis of qualitative descriptions of the scarring by the observers revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other areas, such as botany, geology and archaeology.
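The consensus-coding scheme described above (stack the observers' binary outlines, average per pixel, and keep pixels where at least 50% of observers agreed) might be sketched as follows; the toy 2×3 masks are illustrative, not drawn from the study's data.

```python
import numpy as np

def consensus_mask(outlines: list, agree: float = 0.5) -> np.ndarray:
    """Build the agreement map described above: stack the observers'
    binary outline masks, average per pixel, and threshold so that a
    pixel counts as scar when at least `agree` of observers marked it."""
    stack = np.stack([np.asarray(m, dtype=float) for m in outlines])
    return stack.mean(axis=0) >= agree

# Three hypothetical observers outlining scar tissue on a tiny image.
obs = [np.array([[1, 1, 0],
                 [0, 1, 0]]),
       np.array([[1, 0, 0],
                 [0, 1, 1]]),
       np.array([[1, 1, 0],
                 [0, 1, 0]])]
mask = consensus_mask(obs)
```

The resulting boolean mask is the "consensus scar" against which the automatic algorithm's output can then be compared, e.g. with an overlap measure such as the Dice coefficient.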
Verma, Ark; Brysbaert, Marc
Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.
Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are also dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...
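A minimal sketch of the replicated measurement such a study involves, assuming a precision-at-k metric over a fixed relevant set; the query results below are placeholders, not Ding and Marchionini's data.

```python
def precision_at_k(results: list, relevant: set, k: int = 10) -> float:
    """Fraction of the top-k results judged relevant."""
    top = results[:k]
    return sum(1 for r in top if r in relevant) / len(top)

# Three hypothetical replications of the same query against a search
# engine whose index changes between runs (result IDs are placeholders).
relevant = {"a", "b", "c"}
runs = [["a", "b", "x", "y"],
        ["a", "x", "b", "c"],
        ["x", "a", "y", "b"]]
scores = [precision_at_k(r, relevant, k=4) for r in runs]
spread = max(scores) - min(scores)  # run-to-run variability
```

A non-trivial spread across replications is exactly the kind of instability that makes single-run evaluations of dynamic search tools unreliable.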
Zhongqi Sheng; Lei Zhang; Hualong Xie; Changchun Liu
Assembly accounts for the greatest workload and time consumed during the product design and manufacturing process. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connection between parts, semantic information of transmission, and geometric constraint information were quantified to as...
Baggi, Fulvio; Mantegazza, Renato; Antozzi, Carlo; Sanders, Donald
Clinical registries may facilitate research on myasthenia gravis (MG) in several ways: as a source of demographic, clinical, biological, and immunological data on large numbers of patients with this rare disease; as a source of referrals for clinical trials; and by allowing rapid identification of MG patients with specific features. Physician-derived registries have the added advantage of incorporating diagnostic and treatment data that may allow comparison of outcomes from different therapeutic approaches, which can be supplemented with patient self-reported data. We report the demographic analysis of MG patients in two large physician-derived registries, the Duke MG Patient Registry, at the Duke University Medical Center, and the INNCB MG Registry, at the Istituto Neurologico Carlo Besta, as a preliminary study to assess the consistency of the two data sets. These registries share a common structure, with an inner core of common data elements (CDE) that facilitate data analysis. The CDEs are concordant with the MG-specific CDEs developed under the National Institute of Neurological Disorders and Stroke Common Data Elements Project. © 2012 New York Academy of Sciences.
Full Text Available Despite neuroblastoma being the most common extracranial solid cancer in childhood, it is still a rare disease. Consequently, the unavailability of tissue for research limits the statistical power of studies. Pathology archives are possible sources of rare tissue which, if proven to remain consistent over time, could prove useful for research on rare disease types. We applied immunohistochemistry to investigate whether long-term storage caused any changes to antigens used diagnostically for neuroblastoma. We constructed and quantitatively assessed a tissue microarray (TMA) containing neuroblastoma archival material dating between 1950 and 2007. A total of 119 neuroblastoma tissue cores were included, spanning six decades. Fourteen antibodies were screened across the TMA. These included seven positive neuroblastoma diagnosis markers (NB84, Chromogranin A, NSE, Ki-67, INI1, Neurofilament Protein, Synaptophysin), two anticipated to be negative (S100A, CD99), and five research antibodies (IL-7, IL-7R, JAK1, JAK3, STAT5). The staining of these antibodies was evaluated using Aperio ImageScope software along with novel pattern recognition and quantification algorithms. This analysis demonstrated that marker signal intensity did not decrease over time and that storage for 60 years had little effect on antigenicity. The construction and assessment of this neuroblastoma TMA demonstrated the feasibility of using archival samples for research.
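The time-trend check behind the "little effect on antigenicity" conclusion can be sketched as a simple regression of staining intensity on storage year; the intensity values below are hypothetical stand-ins for the quantified TMA scores, and a near-zero slope indicates no decline over the archive's span.

```python
import numpy as np

# Hypothetical per-decade mean staining intensities (fraction of max),
# spanning the 1950-2007 range of the archival cores described above.
years = np.array([1950, 1960, 1970, 1980, 1990, 2000, 2007])
intensity = np.array([0.82, 0.79, 0.85, 0.81, 0.84, 0.80, 0.83])

# Least-squares linear fit: slope ~ change in intensity per year.
slope, intercept = np.polyfit(years, intensity, 1)
```

With real TMA data one would also report a confidence interval on the slope; here the point is only that "no decrease over time" corresponds to a slope statistically indistinguishable from zero.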
Hunter, Jill V; Wilde, Elisabeth A; Tong, Karen A; Holshouser, Barbara A
This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. This article attempts to address some of the potential uses of more advanced forms of imaging in TBI as well as highlight some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years, because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment that are readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective analysis methods for large-scale image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin labeling (ASL), functional MRI (fMRI), including resting state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. However, we also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children.
Full Text Available Abstract Background Policy makers, clinicians and researchers are demonstrating increasing interest in using data linked from multiple sources to support measurement of clinical performance and patient health outcomes. However, the utility of data linkage may be compromised by sub-optimal or incomplete linkage, leading to systematic bias. In this study, we synthesize the evidence identifying participant or population characteristics that can influence the validity and completeness of data linkage and may be associated with systematic bias in reported outcomes. Methods A narrative review, using structured search methods, was undertaken. The key words "data linkage" and MeSH term "medical record linkage" were applied to the Medline, EMBASE and CINAHL databases between 1991 and 2007. Abstract inclusion criteria were: the article attempted an empirical evaluation of methodological issues relating to data linkage and reported on patient characteristics, the study design included analysis of matched versus unmatched records, and the report was in English. Included articles were grouped thematically according to patient characteristics that were compared between matched and unmatched records. Results The search identified 1810 articles, of which 33 (1.8%) met inclusion criteria. There was marked heterogeneity in study methods and factors investigated. Characteristics that were unevenly distributed among matched and unmatched records were: age (72% of studies), sex (50% of studies), race (64% of studies), geographical/hospital site (93% of studies), socio-economic status (82% of studies) and health status (72% of studies). Conclusion A number of relevant patient or population factors may be associated with incomplete data linkage, resulting in systematic bias in reported clinical outcomes. Readers should consider these factors in interpreting the reported results of data linkage studies.
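The core comparison in such linkage-bias studies, the prevalence of a characteristic among matched versus unmatched records, might be sketched as follows; the record structure and the `older` flag are assumptions for illustration.

```python
def linkage_bias(records: list, flag: str) -> tuple:
    """Compare the prevalence of a binary characteristic (e.g. 'older')
    between records that linked successfully and those that did not.
    A large gap suggests linkage is systematically incomplete for one
    subgroup, which can bias outcomes computed on linked data only."""
    matched = [r for r in records if r["linked"]]
    unmatched = [r for r in records if not r["linked"]]
    prop = lambda rs: sum(r[flag] for r in rs) / len(rs)
    return prop(matched), prop(unmatched)

# Toy records: the characteristic is over-represented among non-links.
recs = [{"linked": True,  "older": True},
        {"linked": True,  "older": False},
        {"linked": False, "older": True},
        {"linked": False, "older": True}]
m_prop, u_prop = linkage_bias(recs, "older")
```

In practice one would test the difference formally (e.g. a chi-square test) and repeat the comparison for each candidate characteristic: age, sex, race, site, socio-economic status, health status.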
McQuade, Sarah; Davis, Louise; Nash, Christine
Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…
The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…
Fancy, Steven G.; Pank, Larry F.; Douglas, David C.; Curby, Catherine H.; Garner, Gerald W.; Amstrup, Steven C.; Regelin, Wayne L.
operation, the UHF (ultra-high frequency) signal failed on three of 32 caribou transmitters and 10 of 36 polar bear transmitters.A geographic information system (GIS) incorporating other databases (e.g., land cover, elevation, slope, aspect, hydrology, ice distribution) was used to analyze and display detailed locational and behavioral data collected via satellite. Examples of GIS applications to research projects using satellite telemetry and examples of detailed movement patterns of caribou and polar bears are presented. This report includes documentation for computer software packages for processing Argos data and presents developments, as of March 1987, in transmitter design, data retrieval using a local user terminal, computer software, and sensor development and calibration.
Crossley, Scott A.
This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…
Søndergaard, Lene Vammen; Dagnæs-Hansen, Frederik; Herskin, Mette S
of the extent of welfare assessment in pigs used in biomedical research and to suggest a welfare assessment standard for research facilities based on an exposition of ethological considerations relevant for the welfare of pigs in biomedical research. The tools for porcine welfare assessment presented suggest...
This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…
The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3 (Pavement Evaluation Tools: Data Collection Methods), was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to...
Highlights: → A highly flexible neutronic core simulator was developed. → The tool estimates the static neutron flux, the eigenmodes, and the neutron noise. → The tool was successfully validated via many benchmark cases. → The tool can be used for research and education. → The tool is freely available. - Abstract: This paper deals with the development, validation, and demonstration of an innovative neutronic tool. The novelty of the tool resides in its versatility, since many different systems can be investigated and different kinds of calculations can be performed. More precisely, both critical systems and subcritical systems with an external neutron source can be studied, and static and dynamic cases in the frequency domain (i.e. for stationary fluctuations) can be considered. In addition, the tool has the ability to determine the different eigenfunctions of any nuclear core. For each situation, the static neutron flux, the different eigenmodes and eigenvalues, the first-order neutron noise, and their adjoint functions are estimated, as well as the effective multiplication factor of the system. The main advantages of the tool, which is entirely MatLab based, lie in the robustness of the implemented numerical algorithms, its high portability between different computer platforms and operating systems, and finally its ease of use, since no input deck writing is required. The present version of the tool, which is based on two-group diffusion theory, is mostly suited to investigating thermal systems. The definition of both the static and dynamic core configurations directly from the static macroscopic cross-sections and their fluctuations, respectively, makes the tool particularly well suited for research and education. Some of the many benchmark cases used to validate the tool are briefly reported. The static and dynamic capabilities of the tool are also demonstrated for the following configurations: a vibrating control rod, a perturbation traveling upwards
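As a stripped-down illustration of the static solver described above, here is a one-group, one-dimensional diffusion eigenvalue calculation by power iteration (the actual tool is two-group, MatLab based, and far more capable); the cross-section values below are illustrative, not taken from the paper.

```python
import numpy as np

def k_effective(n=50, length=100.0, D=1.0, sig_a=0.01, nu_sig_f=0.012):
    """One-group, 1-D diffusion k-eigenvalue solve by power iteration.
    Units and cross-sections are illustrative (slab with zero-flux
    boundaries); returns k-effective and the normalized static flux."""
    h = length / n
    # Finite-difference diffusion + absorption operator (tridiagonal).
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2 * D / h**2 + sig_a
        if i > 0:
            A[i, i - 1] = -D / h**2
        if i < n - 1:
            A[i, i + 1] = -D / h**2
    phi = np.ones(n)
    k = 1.0
    for _ in range(200):  # power iteration on the fission source
        source = nu_sig_f * phi
        phi_new = np.linalg.solve(A, source / k)
        k *= phi_new.sum() / phi.sum()
        phi = phi_new
    return k, phi / phi.max()

k, phi = k_effective()
```

A two-group version adds a second flux vector coupled through down-scattering, but the outer power iteration on the fission source is the same idea.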
Showcasing exemplars of how various aspects of design research were successfully transitioned into and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the Impact of Design Research on Industrial Practice. Chapters written by internationally acclaimed researchers of design analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform industrial practice of engineering design, whilst the chapters that are written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice; as well as guidelines and platforms for successf...
Patrice L. Capers
Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. This method is flexible and may provide a practical way to examine large corpora of
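A heavily simplified sketch of the DS+MI idea: estimate P(high-rigor label | low-rigor label) in the doubly sampled subsample, impute the missing high-rigor labels in the large sample several times, and pool the estimates. The real method conditions on additional covariates (RCT status, country, year) and pools via Rubin's rules; the labels below are toy data.

```python
import random

def ds_mi_prevalence(subsample, large_lo, m=5, seed=0):
    """Double sampling + simplified multiple imputation.
    `subsample`: (lo, hi) label pairs rated by BOTH methods.
    `large_lo`: low-rigor labels only, for the large sample.
    Imputes the missing high-rigor labels from P(hi | lo) estimated
    in the subsample, m times, and averages the prevalence."""
    rng = random.Random(seed)
    p_hi = {}
    for lo in (0, 1):
        his = [hi for l, hi in subsample if l == lo]
        p_hi[lo] = sum(his) / len(his)
    estimates = []
    for _ in range(m):
        imputed = [1 if rng.random() < p_hi[lo] else 0 for lo in large_lo]
        estimates.append(sum(imputed) / len(imputed))
    return sum(estimates) / m

# Toy case where the two rating methods agree perfectly (phi = 1).
sub = [(1, 1), (1, 1), (0, 0), (0, 0)]
est = ds_mi_prevalence(sub, large_lo=[1, 1, 0, 1])
```

When the low- and high-rigor methods agree perfectly, as in the toy data, the DS+MI estimate simply reproduces the large-sample low-rigor prevalence; the gain appears when agreement is imperfect and the subsample calibrates the cheap labels.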
Li, Linda C; Adam, Paul; Townsend, Anne F; Stacey, Dawn; Lacaille, Diane; Cox, Susan; McGowan, Jessie; Tugwell, Peter; Sinclair, Gerri; Ho, Kendall; Backman, Catherine L
Abstract Background People with rheumatoid arthritis (RA) should use DMARDs (disease-modifying anti-rheumatic drugs) within the first three months of symptoms in order to prevent irreversible joint damage. However, recent studies report the delay in DMARD use ranges from 6.5 months to 11.5 months in Canada. While most health service delivery interventions are designed to improve the family physician's ability to refer to a rheumatologist and prescribe treatments, relatively little has been do...
Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R
This report describes miRQuest, a novel middleware, available in a Web server, that allows the end user to perform miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods to accomplish this task, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on available tools for miRNA prediction. As results, three different use cases of the tools are also described, one being miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool, and it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.
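The middleware's two functions, accepting FASTA input and dispatching it to multiple prediction tools for side-by-side comparison, might be sketched like this; the two lambda "tools" are hypothetical stand-ins, not the predictors miRQuest actually wraps.

```python
def parse_fasta(text: str) -> dict:
    """Minimal parser for the FASTA input format the middleware accepts."""
    seqs, name = {}, None
    for line in text.strip().splitlines():
        if line.startswith(">"):
            name = line[1:].strip()
            seqs[name] = ""
        elif name:
            seqs[name] += line.strip()
    return seqs

def run_tools(seqs: dict, tools: dict) -> dict:
    """Dispatch every sequence to every registered prediction tool and
    collect the verdicts so they can be compared side by side."""
    return {n: {t: f(s) for t, f in tools.items()} for n, s in seqs.items()}

# Hypothetical stand-ins for the real predictors a middleware would wrap.
tools = {"toolA": lambda s: len(s) > 18,
         "toolB": lambda s: s.count("G") >= 4}
fasta = ">mir-x\nUGAGGUAGUAGGUUGUAUAGUU\n>frag-1\nACGU\n"
verdicts = run_tools(parse_fasta(fasta), tools)
```

Disagreement between tools in the `verdicts` table is precisely what the comparison function surfaces for the user.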
Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.
Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.
The aim of the investigation is to explore the possibility of philosophical conversation with the child. Methods. The author uses general scientific research methods, including observation and interviews, and philosophical analysis. Results and scientific novelty. The author reveals the essence of philosophical conversations with the child and discusses the main reasons for the extinction of children's curiosity, illustrating them with examples of erroneous adult behaviour in dealing with children. It is shown that if the teacher does not find a systematic way to engage children in essential discussion, the children most likely will not learn how to contemplate seriously. The author gives detailed guidance on how to answer children's questions. Practical significance. The article may be of interest to parents, teachers, experts in the field of psychology of creativity, post-graduates and organizers of independent activity of students of higher education institutions.
Wight, Evelyn; Gardner, Gene; Harvey, Tony
As a reflection of its growing culture of openness, and in response to the public's need for accurate information about its activities, the U.S. Department of Energy (DOE) Office of the Assistant Secretary for Environmental Restoration and Waste Management (EM) has increased the amount of information available to the public through communication tools such as brochures, fact sheets, and a travelling exhibit with an interactive computer display. Our involvement with this effort has been to design, develop, and critique booklets, brochures, fact sheets and other communication tools for EM. This paper presents an evaluation of the effectiveness of two communication tools we developed: the EM Booklet and the EM Fact Sheets. We measured effectiveness using non-parametric testing. This paper describes DOE's culture change, EM's communication tools and their context within DOE's new open culture, our research, test methods and results, the significance of our research, and our plans for future research. (author)
Vellinga, P. [Milieuwetenschappen, Vrije Universiteit, Amsterdam (Netherlands); Van Dorland, R. [KNMI, De Bilt (Netherlands); Kabat, P. [Aardsystemen en Klimaatstudies, Wageningen Universiteit, Wageningen (Netherlands)
In some previous issues of this magazine (Spil 2007, issues 4 and 5-6, and Spil 2008, issue 1) the authors Labohm, Roersch and Thoenes launched a frontal attack on the greenhouse theory and on the researchers who report on the state of the science in the framework of the IPCC. The author of this article addresses two main questions raised by the above-mentioned authors: (1) Does the use of fossil fuels affect the global climate?; and (2) Is the warming of the last 30 years related to the increased concentrations of greenhouse gases in the atmosphere?
In the context of e-science and open access, the visibility and impact of scientific results and data have become important for spreading information to users and to society in general. The aim of this broader economic trend is to feed the innovation process and create economic value. In our institute, the French National Research Institute of Science and Technology for Environment and Agriculture (Irstea), the department in charge of scientific and technical information, with the help of other professionals (scientists, IT professionals, ethics advisors...), has recently developed services tailored to researchers and their data management needs, in order to answer European recommendations for open data. This effort required reviewing the workflows between databases and questioning the organizational relationships between skills, occupations, and departments in the institute. In fact, data management requires all professionals and researchers to assess their ways of working together.
Murawska, Jaclyn M.; Walker, David A.
In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…
Asselin, Marlene; Moayeri, Maryam
Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…
Ferguson, Amanda M; McLean, David; Risko, Evan F
Recent technological advances have given rise to an information-gathering tool unparalleled by any in human history: the Internet. Understanding how access to such a powerful informational tool influences how we think represents an important question for psychological science. In the present investigation we examined the impact of access to the Internet on the metacognitive processes that govern our decisions about what we "know" and "don't know." Results demonstrated that access to the Internet influenced individuals' willingness to volunteer answers, which led to fewer correct answers overall but greater accuracy when an answer was offered. Critically, access to the Internet also influenced feeling-of-knowing, and this accounted for some (but not all) of the effect on willingness to volunteer answers. These findings demonstrate that access to the Internet can influence metacognitive processes, and contribute novel insights into the operation of the transactive memory system formed by people and the Internet. Copyright © 2015 Elsevier Inc. All rights reserved.
Keller, L Robin; Wang, Yitong
For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.
The massive growth of and access to information technology (IT) has enabled the integration of technology into classrooms. One such integration is the use of WebQuests as an instructional tool for targeted learning activities such as writing abstracts of research articles in English for English as a Foreign Language (EFL) learners. In the academic world, writing the abstract of a research paper or final project in English can be challenging for EFL students. This article presents an action research project on the process and outcomes of using a WebQuest designed to help 20 Indonesian university IT students write a research article's abstract in English. Findings reveal that despite positive feedback, changes need to be made to make the WebQuest a more effective instructional tool for the purpose it was designed for.
Philip W. Gassman; Manuel R. Reyes; Colleen H. Green; Jeffrey G. Arnold
The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been ad...
The production lines used for manufacturing U-shaped profiles are very complex and must have high productivity. One of the most important stages of the fabrication process is cutting-off. This paper presents experimental research and analysis of the durability of the cutting tools used for cutting off U-shaped steel profiles. The results of this work can be used to predict the durability of the cutting tools.
Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday
In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…
To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing for simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing for observation of group-level dynamics; (3) heat maps of levels of indicators changing over time, allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further participation in open data standards like HICDEP by the HIV research community.
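The "event probability curves" behind the first class of plots are typically survival estimates. As a minimal sketch of that computation, the toy Kaplan-Meier product-limit estimator below is written in Python for illustration; the actual tool is implemented in R, and the data, function name, and follow-up times here are invented for the example.

```python
def kaplan_meier(times, events):
    """Toy product-limit survival estimate.

    times  : follow-up time for each patient
    events : 1 if the event (e.g. death) occurred, 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for tt, e in pairs if tt == t and e)
        total = sum(1 for tt, _ in pairs if tt == t)
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= total  # both events and censorings leave the risk set
    return curve

# Hypothetical cohort: months of follow-up and event indicators
print(kaplan_meier([6, 12, 12, 24], [1, 0, 1, 1]))
```

Plotting such a curve alongside longitudinal CD4 measurements is then a rendering step, which the published tool handles in R.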
Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.
The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economic and flexible system for this purpose. The CAMAC system also facilitates the development of modular components that control the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments.
Tahmasebi, Farhad; Pearce, Robert
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
Powder brazing filler metals (PBFMs) feature a number of comparative advantages, including low energy consumption, accurate dosage, good brazeability, short production time, and high production efficiency. These filler metals have been used in the aerospace, automobile, and electric appliance industries. PBFMs are especially suitable for bonding diamond tools, which involve complex workpiece shapes and require accurate dosage. Recent research on PBFMs for diamond tools is reviewed in this paper, and current applications are discussed. CuSnTi and Ni-Cr-based PBFMs have been the two most commonly used monolayer PBFMs, so the bonding mechanisms at the interface between these monolayer PBFMs and a diamond tool are summarized first, and ways to improve the performance of monolayer PBFMs for diamond tools are analyzed. Next, research on PBFMs for impregnated diamond tools is reviewed, and the technical problems that urgently need solutions are discussed. Finally, the challenges and opportunities in PBFM research and development for diamond tools are summarized, and corresponding prospects are suggested.
Wen, Dunwei; Cuzzola, John; Brown, Lorna; Kinshuk
Question answering systems have frequently been explored for educational use. However, their value was somewhat limited due to the quality of the answers returned to the student. Recent question answering (QA) research has started to incorporate deep natural language processing (NLP) in order to improve these answers. However, current NLP…
Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles
This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.
Liu, PhD, Charles
From planetary movements and the exploration of our solar system to black holes and dark matter, this comprehensive reference simplifies all aspects of astronomy with an approachable question-and-answer format. With chapters broken into various astronomical studies, including the universe, galaxies, planets, and space exploration, this fully updated resource is an ideal companion for students, teachers, and amateur astronomers, answering more than 1,000 questions, such as: Is the universe infinite? What would happen to you if you fell into a black hole? What are the basic concepts of Einstein's special theory of relativity? And who was the first person in space?
Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf
In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and integrating them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command-line and its output goes to the standard output, then in few minutes one can m...
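The integration contract described above, a tool that can be executed from a command line with its output going to standard output, can be sketched as a small wrapper. The function below is a hypothetical illustration of that contract, not EasyInterface's actual API; only `subprocess.run` and its shown parameters are real Python.

```python
import subprocess
import sys

def run_tool(cmd):
    """Run a command-line tool and capture its standard output.

    This is the minimal contract a web GUI wrapper relies on: give it
    an argv list, get back the exit status and whatever the tool printed.
    (Illustrative sketch; EasyInterface's real interface differs.)
    """
    result = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
    return {"status": result.returncode, "output": result.stdout}

# Any tool whose results go to stdout can be exposed this way;
# here we use the Python interpreter itself as a stand-in tool.
demo = run_tool([sys.executable, "-c", "print('analysis done')"])
print(demo["output"].strip())
```

A web front end would then render `output` in a browser pane, which is the step the toolkit automates for the tool author.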
POP Nicolae Al.
Starting from the meaning of the communication process in marketing, the authors try to identify its role in ensuring the continuity of the management process with respect to the company's long-term relationships with all of its partners. Emphasis is placed on the role of online communication and its tools in relationship marketing. In order to validate some of these ideas, the authors undertook qualitative marketing research among the managers of Romanian tourism companies. The qualitative part of the study aimed to identify the main tools underlying communication with the beneficiaries of tourism services, and the ways in which companies use online communication tools for attracting, keeping and developing long-term customer relationships in the virtual environment. The following tools were analyzed: websites, email marketing campaigns, e-newsletters, online advertising, search engines, sponsored links, blogs, RSS feeds, social networks, forums, online discussion groups, portals, infomediaries and instant messaging. A particularly important result is the ranking obtained when respondents were asked to name the most efficient tools for attracting customers and for maintaining relationships with them. Although awareness of online marketing tools is high, some tools are known by definition but not used at all or not used correctly, while others are not known by definition but are used in practice. The authors' contribution lies in validating a performant methodology of qualitative research, a study which will open new ways and means for making the online communication tools used for touristic services in...
The Question Answer Relationship (QAR) strategy equips students with tools to successfully decode and comprehend what they read. An action research project over 18 days with twenty-three kindergarteners adapted exposure to QAR's "In the Book" and "In my Head" categories with similar questions for each of two popular Aesop's fables. The challenges and outcomes are presented with special emphasis on teacher preparation, teacher reflections, and hands-on, day-by-day project implementation. An oral pre-test, after reading The Tortoise and the Hare, served as a baseline assessment of student comprehension levels. The QAR strategy was then explicitly taught, with opportunities to practice the comprehension skills in small and large groups with parental assistance. Students overwhelmingly scored higher on the post-test reading comprehension after the read-aloud of The Jay and the Peacock, with some receiving perfect scores.
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
Background International guidelines recommend the use of appropriate informed consent procedures in low-literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38, p ...). The multimedia tool was acceptable and easy to administer among low-literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065
Karlsson, Bodil S A; Allwood, Carl Martin
The Dress photograph, first displayed on the internet in 2015, revealed stunning individual differences in color perception. The aim of this study was to investigate whether lay-persons believed that the question about The Dress' colors was answerable. Past research has found that optimism is related to judgments of how answerable knowledge questions with controversial answers are (Karlsson et al., 2016). Furthermore, familiarity with a question can create a feeling of knowing the answer (Reder and Ritter, 1992). Building on these findings, 186 participants saw the photo of The Dress and were asked about the correct answer to the question about The Dress' colors ("blue and black," "white and gold," "other, namely...," or "there is no correct answer"). Choice of the alternative "there is no correct answer" was interpreted as believing the question was not answerable. This answer was chosen more often by optimists and by people who reported they had not seen The Dress before. We also found that among participants who had seen The Dress photo before, 19% perceived The Dress as "white and gold" but believed that the correct answer was "blue and black." This, in analogy to previous findings about non-believed memories (Scoboria and Pascal, 2016), shows that people sometimes do not believe the colors they have perceived are correct. Our results suggest that individual differences related to optimism and previous experience may influence whether an individual's perception of a photograph is a sufficient basis for valid conclusions about its colors. Further research about color judgments under ambiguous circumstances could benefit from separating individual perceptual experience from beliefs about the correct answer to the color question. Including the option "there is no correct answer" may also be beneficial.
Macdermid, Joy C; Miller, Jordan; Gross, Anita R
Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain: a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.
Resonance – Journal of Science Education, Volume 2, Issue 5 (May 1997). What Can the Answer be? Reciprocal Basis in Two Dimensions and Other Nice Things. V Balakrishnan, Indian Institute of Technology, Chennai 600 036, India.
...lems is to ask what the answer could possibly be, under the constraints of the given problem. In the first part of this series, this approach is illustrated with some examples from elementary vector analysis. Scientific problems are very often first solved by a combination of analogy, educated guesswork and elimination.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited.
Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi
For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
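The risk definition stated above (probability of a successful attack times the value of the resulting loss) can be sketched as a toy calculation. The function names and all the numbers below are illustrative assumptions for exposition, not output of the prototype tool or its calculation engines.

```python
def risk(p_success, loss):
    """Risk = probability of a successful attack x value of the resulting loss."""
    return p_success * loss

def risk_reduction(p_before, p_after, loss):
    """Estimated benefit of a mitigation that lowers the attack
    success probability from p_before to p_after (same expected loss)."""
    return risk(p_before, loss) - risk(p_after, loss)

# Hypothetical figures: a mitigation drops success probability 0.10 -> 0.02
# against an expected loss of $5,000,000.
benefit = risk_reduction(0.10, 0.02, 5_000_000)
print(benefit)  # expected annual benefit to weigh against the mitigation cost
```

This is exactly the kind of number a cost-benefit analysis compares against the price of the mitigation action; the hard research problems the abstract mentions lie in estimating `p_success` credibly, for example from attack trees or attack graphs.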
Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly
Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms -- "sniffers", administrative reports, decision support and clinical research applications are presented.
Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear
Nowadays broad interest in a couple of interlinked subject areas can make the configuration of a research group much diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of KMANT, the research group for knowledge management and its applications to nuclear technology at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group and the diversified nature of their relations with the group and with IPEN. The technical competencies and current research lines (or programs) are then described, as well as the research projects and the management scheme of the group. Next, the web-based management and collaboration tools are described, along with our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, such as graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and available tool set in the group's performance is presented. (author)
James S. Bates
Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesot...
Harbottle, Jennifer; Strangward, Patrick; Alnuamaani, Catherine; Lawes, Surita; Patel, Sanjai; Prokop, Andreas
The "droso4schools" project aims to introduce the fruit fly "Drosophila" as a powerful modern teaching tool to convey curriculum-relevant specifications in biology lessons. Flies are easy and cheap to breed and have been at the forefront of biology research for a century, providing unique conceptual understanding of biology and…
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the conduct of this complex meta-analysis and enhancing communication and document sharing among research team members. PMID:23681256
Trexler, Grant Lewis
This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…
Narasimharao, B. Pandu, Ed.; Wright, Elizabeth, Ed.; Prasad, Shashidhara, Ed.; Joshi, Meghana, Ed.
Higher education institutions play a vital role in their surrounding communities. Besides providing a space for enhanced learning opportunities, universities can utilize their resources for social and economic interests. The "Handbook of Research on Science Education and University Outreach as a Tool for Regional Development" is a…
Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.
Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered: (1) automation; (2) accurate multi-gamma-ray sources; (3) the accuracy of the current and future γ-ray energy scale; and (4) new solid-state X- and γ-ray detectors.
Tahmasebi, Farhad; Pearce, Robert
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.
The article is a reflection on the use of an oral diary as a qualitative research tool, the role that it played during fieldwork and the methodological issues that emerged. It draws on a small-scale empirical study into primary school teachers' use of group discussion, during which oral diaries were used to explore and document teacher reflective…
Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.
Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…
Roerig, S.; Evers, S.J.T.M.; Krabbendam, L.
The relation between theatre, or drama, and research is not novel, as is illustrated by concepts such as role theory, theatre for development, or distancing in drama therapy. In various scientific fields theatre is used as a communicative and/or educative tool; however, in the realm of childhood
Pitsch, Karola; Neumann, Alexander; Schnier, Christian; Hermann, Thomas
We suggest that an Augmented Reality (AR) system for coupled interaction partners provides a new tool for linguistic research that allows researchers to manipulate the co-participants' real-time perception and action. It encompasses novel facilities for recording heterogeneous sensor-rich data sets to be accessed in parallel with qualitative/manual and quantitative/computational methods.
Human rights education (HRE) aims to achieve a change of mindsets and social attitudes that entails the construction of a culture of respect towards those values it teaches. Although HRE is a recent field of study, its consolidation in Latin America is a fact. During the latest decades several authors have carried out research related to HRE that…
Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki
Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…
Virtual Globes are a paradigm shift in the way earth sciences are conducted. With these tools, nearly all aspects of earth science can be integrated, from field science, to remote sensing, to remote collaborations, to logistical planning, to data archival/retrieval, to PDF paper retrieval, to education and outreach. Here we present an example of how VGs can be fully exploited for field sciences, using research at McCall Glacier, in Arctic Alaska.
Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K
Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study attempts to further develop these areas of study, resulting in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn-surface invasiveness, complicating the ability to investigate duration-related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
Sunil Kumar C
With the number of students growing each year, there is a strong need for automated systems capable of evaluating descriptive answers. Unfortunately, few systems can perform this task. In this paper, we use a machine learning tool called LightSIDE to accomplish automatic evaluation and scoring of descriptive answers. Our experiments are designed around our primary goal of identifying the optimum training sample size for optimum auto scoring. Besides the technical overview and the experiment design, the paper also covers the challenges and benefits of the system. We also discuss interdisciplinary areas for future research on this topic.
Seul, M.; Brazil, L.; Castronova, A. M.
CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration. Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, as it is a central need for many of these research topics. CUAHSI (the Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source CUAHSI-supported tools that enable enhanced data discovery online using advanced search capabilities and computational analysis run in virtual environments pre-designed for educators and scientists, so they can focus their efforts on data analysis rather than IT set-up.
Many firms offer information technology research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.
Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms.
's claim by fellow scientists, and (3) demonstrate the utility and value of the research contribution to any interested parties. However, turning an exploratory prototype into a “proper” tool for end-users often entails great effort. Heavyweight mainstream frameworks such as Eclipse do not address this issue; their steep learning curves constitute substantial entry barriers to such ecosystems. In this paper, we present the Model Analyzer/Checker (MACH), a stand-alone tool with a command-line interpreter. MACH integrates a set of research prototypes for analyzing UML models. By choosing a simple command-line interpreter rather than a (costly) graphical user interface, we achieved the core goal of quickly deploying research results to a broader audience while keeping the required effort to an absolute minimum. We analyze MACH as a case study of how requirements and constraints in an academic
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Keywords: cluster-based query expansion, learning answering strategies, machine learning in NLP. Abstract: During recent years, question… process is typically tedious and involves expertise in crafting and implementing these models (e.g. rule-based), utilizing NLP resources, and… questions. For languages that use capitalization (e.g. not Chinese or Arabic) for named entities, IBQA can make use of NE classing (e.g. “Bob Marley
Participatory-action research encourages the involvement of all key stakeholders in the research process and is especially well suited to mental health research. Previous literature outlines the importance of engaging stakeholders in the development of research questions and methodologies, but little has been written about ensuring the involvement of all stakeholders (especially non-academic members) in dissemination opportunities such as publication development. The Article Idea Chart was developed as a specific methodology for engaging all stakeholders in data analysis and publication development. It has been successfully utilised in a number of studies and is an effective tool for ensuring the dissemination process of participatory-action research results is both inclusive and transparent to all team members, regardless of stakeholder group. Keywords: participatory-action research, mental health, dissemination, community capacity building, publications, authorship
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.
Sacramento, Jose Miguel Noronha
This work aims to assist research institutes, notably IPEN, in improving their assertiveness in the process of defining their research lines. Evolutionary speeds have increased exponentially, requiring greater synchronism and multiple, coordinated action from the three elements fundamental to the development of contemporary society: government, the productive structure, and the science and technology infrastructure. This increasingly dynamic and mutable environment demands greater proximity to the socioeconomic environment, in which the former client-consumer has become a co-creator of knowledge and a source of energy within a new standard of social relations, the so-called Networked Society. The difference in time scale for the university, the productive structure and government is a function of their main activities: science, the market, and the winning of public opinion, respectively. The equation that will harmonize and find synergies between these three dimensions is the contemporary challenge for those who seek to innovate and advance knowledge in order to improve society's standard of living. This work argues that research institutes should heed the words of Robert Plomin and start connecting to the several links in different chains, so as to draw on a collective intelligence that expands continuously, with speed and quality greater than at any other time in human history. The comparison among the results obtained from the different methodologies of analysis proposed in this work reveals strengths, weaknesses, threats and opportunities of IPEN, providing inputs for finding better ways to tailor its performance to the new demands. (author)
Atkinson, Nancy L; Massett, Holly A; Mylks, Christy; McCormack, Lauren A; Kish-Doto, Julia; Hesse, Bradford W; Wang, Min Qi
Informatics applications have the potential to improve participation in clinical trials, but their design must be based on user-centered research. This research used a fully counterbalanced experimental design to investigate the effect of changes made to the original version of a website, http://BreastCancerTrials.org/, and confirm that the revised version addressed and reinforced patients' needs and expectations. Participants included women who had received a breast cancer diagnosis within the last 5 years (N=77). They were randomized into two groups: one group used and reviewed the original version first followed by the redesigned version, and the other group used and reviewed them in reverse order. The study used both quantitative and qualitative measures. During use, participants' click paths and general reactions were observed. After use, participants were asked to answer survey items and open-ended questions to indicate their reactions and which version they preferred and met their needs and expectations better. Overall, the revised version of the site was preferred and perceived to be clearer, easier to navigate, more trustworthy and credible, and more private and safe overall. However, users who viewed the original version last had similar attitudes toward both versions. By applying research findings to the redesign of a website for clinical trial searching, it was possible to re-engineer the interface to better support patients' decisions to participate in clinical trials. The mechanisms of action in this case appeared to revolve around creating an environment that supported a sense of personal control and decisional autonomy.
Anna Kirova PhD
In this article the authors explore the effect of word-image relationships on the collection of data and the reporting of research results for a study involving the development of a series of fotonovelas with immigrant children in an inner-city school. The central question explored in this article is: Can experiences such as producing visual narratives in the form of fotonovelas stimulate multiple expressions of voice and position, and bring awareness of embodied ways of communicating, in a culture-rich school context? The processes involved in collaboratively developing the photographic narrative format of the fotonovela combine visual elements and structures and embodied, reflective performance together with written text. As a research method, the fotonovela does not merely translate verbal into visual representations but constructs a hybrid photo-image-text that opens new spaces for dialogue, resistance, and representation of a new way of knowing that changes the way of seeing and has the potential to change the author's and the reader's self-understanding.
Skonieczny, Łukasz; Rybiński, Henryk; Kryszkiewicz, Marzena; Niezgódka, Marek
This book is a selection of results obtained within three years of research performed under SYNAT, a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and the open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled “Intelligent Tools for Building a Scientific Information Platform” and “Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions”, were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its contents are based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview of and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building...
Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu
Thermal error is the main factor affecting the accuracy of precision machining. Through experiments, this paper studies thermal error testing and intelligent modeling for the spindle of a vertical high-speed CNC machine tool, a current focus of machine-tool thermal error research. Several thermal error testing devices were designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion-prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
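The PCA step of this pipeline can be sketched on synthetic data. Everything below is an assumption for illustration: the two-heat-source model, the noise levels, and the final regression step (the paper fits an artificial neural network; for brevity this sketch fits the displacement by linear least squares on the principal-component scores).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the experiment: 7 spindle temperature
# channels driven by two latent heat sources, plus one thermal
# displacement reading per sample.
n = 200
base = rng.normal(size=(n, 2))                      # latent heat sources
T = base @ rng.normal(size=(2, 7)) + 0.05 * rng.normal(size=(n, 7))
disp = 3.0 * base[:, 0] - 1.5 * base[:, 1] + 0.1 * rng.normal(size=n)

# PCA on the temperature channels to extract the dominant thermal
# modes (the paper uses PCA to optimize the choice of measuring
# points; here we simply keep the top two components).
Tc = T - T.mean(axis=0)
cov = np.cov(Tc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
top = eigvecs[:, np.argsort(eigvals)[::-1][:2]]     # top-2 components
scores = Tc @ top

# Compensation model: displacement as a function of the PC scores.
X = np.column_stack([scores, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, disp, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((disp - pred) ** 2) / np.sum((disp - disp.mean()) ** 2)
print(f"R^2 of compensation model: {r2:.3f}")
```

Because the displacement is (by construction) nearly linear in the latent heat sources, the two principal components recover it almost perfectly here; on real spindle data a nonlinear model such as the paper's neural network is what handles the residual structure.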
Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley
Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas, or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of availability of consistent national data measuring land use mix, at this stage it has not been possible to create a national walkability measure. The next stage of the project is to increase the usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia. It
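The three sub-components named above (street connectivity, density and land use mix) are commonly combined as a sum of z-scores, with land-use mix measured by entropy. The sketch below shows that general formulation with hypothetical neighbourhood values; it is not the AURIN tool's actual implementation or weighting.

```python
import numpy as np

def land_use_entropy(shares):
    """Land-use mix as normalised entropy of area shares, in [0, 1]."""
    p = np.asarray(shares, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(shares)))

def z(x):
    """Standardise across neighbourhoods (population std)."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def walkability_index(connectivity, density, mix):
    """Composite walkability index: sum of sub-component z-scores."""
    return z(connectivity) + z(density) + z(mix)

# Three hypothetical neighbourhoods: intersections per km^2,
# dwellings per hectare, and entropy over 4 land-use classes.
conn = np.array([120.0, 60.0, 35.0])
dens = np.array([45.0, 25.0, 12.0])
mix = np.array([land_use_entropy([0.4, 0.3, 0.2, 0.1]),
                land_use_entropy([0.7, 0.2, 0.1, 0.0]),
                land_use_entropy([0.95, 0.05, 0.0, 0.0])])
idx = walkability_index(conn, dens, mix)
print(idx.argmax())  # the densest, best-connected, most mixed area ranks first
```

Standardising each sub-measure before summing is what lets quantities with different units (intersection counts, dwelling densities, dimensionless entropy) be combined into one benchmark comparable across areas.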
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
International guidelines recommend the use of appropriate informed consent procedures in low-literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the randomisation items. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38, p…). Further research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.
Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y
While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
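One of the adjustment techniques named above, propensity score analysis, can be illustrated with a minimal simulation. Everything below (the data-generating process, the effect size of 2.0, the model) is invented for illustration; this is a sketch of inverse-probability weighting with estimated propensity scores, not the method of any particular study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated confounded observational data: covariate x raises both the
# chance of treatment and the outcome, so a naive group comparison
# overstates the treatment effect.
n = 5000
x = rng.normal(size=n)
t = rng.random(n) < 1 / (1 + np.exp(-x))    # treatment more likely for high x
y = 2.0 * t + 3.0 * x + rng.normal(size=n)  # true treatment effect = 2.0

# Step 1: fit a logistic model of P(treatment | x) by gradient descent.
w = b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(w * x + b)))
    w -= 0.1 * np.mean((p - t) * x)
    b -= 0.1 * np.mean(p - t)
ps = 1 / (1 + np.exp(-(w * x + b)))         # estimated propensity scores

# Step 2: inverse-probability weighting re-balances the two arms,
# recovering an estimate close to the true effect.
ate = (np.sum(t * y / ps) / np.sum(t / ps)
       - np.sum((1 - t) * y / (1 - ps)) / np.sum((1 - t) / (1 - ps)))
naive = y[t].mean() - y[~t].mean()          # biased upward by confounding
```

The contrast between `naive` and `ate` is the point of the technique: adjustment by the propensity score removes the confounding that the raw comparison absorbs.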
Seltzer, Erica D.; Stolley, Melinda R.; Mensah, Edward K.; Sharp, Lisa K.
Purpose The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace and their perceptions of using SNS for recruitment into survivorship research. Methods Sixty White, Black and Hispanic adult childhood cancer survivors (range 18-48 years of age), who were randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study (CHLS), participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Results Seventy percent of participants reported SNS usage, of whom 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. Conclusions and implications for cancer survivors The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for recruitment of research. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research. PMID:24532046
Seltzer, Erica D; Stolley, Melinda R; Mensah, Edward K; Sharp, Lisa K
The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace and their perceptions of using SNS for recruitment into survivorship research. Sixty White, Black, and Hispanic adult childhood cancer survivors (range 18-48 years of age), who were randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study, participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Seventy percent of participants reported SNS usage, of whom 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for recruitment of research. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research.
Costa, Fabricio F
Advances in information technology have improved our ability to gather, collect and analyze information from individuals online. Social networks can be seen as a nonlinear superposition of a multitude of complex connections between people where the nodes represent individuals and the links between them capture a variety of different social interactions. The emergence of different types of social networks has fostered connections between individuals, thus facilitating data exchange in a variety of fields. Therefore, the question posed now is "can these same tools be applied to life sciences in order to improve scientific and medical research?" In this article, I will review how social networks and other web-based tools are changing the way we approach and track diseases in biomedical research. Copyright © 2012 Elsevier Ltd. All rights reserved.
Martínez Ruiz, María Ángeles; Ávalos Ramos, María Alejandra; Merma Molina, Gladys
The aim of this study is to analyse the metaphorical expressions designed by Science of Sport and Physical Activity university students, as a tool for inquiring into two research questions: their perceptions of their physical education teachers, and the meaning physical activity has in the students' personal lives. 51 students from the University of Alicante participated in the study. The qualitative data analysis software AQUAD 6 was used for data processing. The results obtained from the analysis of ...
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a m...
Full Text Available Aim/Purpose: These days educators are expected to integrate technological tools into classes. Although they acquire relevant skills, they are often reluctant to use these tools. Background: We incorporated online forums for generating a Community of Inquiry (CoI) in a faculty development program. Extending the Technology, Pedagogy, and Content Knowledge (TPACK) model with Assessment Knowledge, and drawing on content analysis of forum discourse and reflection after each CoI, we offer the Diagnostic Tool for Learning, Assessment, and Research (DTLAR). Methodology: This study spanned two cycles of a development program for medical faculty. Contribution: This study demonstrates how the DTLAR supports in-depth examination of the benefits and challenges of using CoIs for learning and teaching. Findings: Before the program, participants had little experience with, and were reluctant to use, CoIs in classes. At the program's completion, many were willing to adopt CoIs and appreciated the method's contribution. Both CoI discourse and reflections included positive attitudes regarding cognitive and teacher awareness categories. However, negative attitudes regarding affective aspects and the time-consuming nature of CoIs were exposed. Participants who experienced facilitating a CoI gained additional insights into its usefulness. Recommendations for Practitioners: The DTLAR allows analyzing the adaptation of online forums for learning and teaching. Recommendation for Researchers: The DTLAR allows analyzing factors that affect the acceptance of online forums for learning and teaching. Impact on Society: While the tool was implemented in the context of medical education, it can be readily applied in other adult learning programs. Future Research: The study includes several design aspects that probably affected the improvement and challenges we found. Future research is called for to provide guidelines for identifying boundary conditions and potential for further
In 1988, the Uranium Institute, a London-based international association of industrial enterprises in the nuclear industry, published a report entitled The Safety of Nuclear Power Plants. Based on an assessment by an international group of senior nuclear experts from eight countries, the report provides an authoritative explanation, for non-specialists, of the basic principles of reactor safety, their application, and their implications. Some questions and answers are selected from that report; they address only a few of the subjects that the report itself examines in greater detail.
Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia
Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those workflows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012.
Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A
Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
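The abstract does not publish the tool's PHP/MySQL code, so the sketch below only illustrates the general idea it describes: assisting a user in constructing an order-related query and auto-generating the corresponding database query. SQLite stands in for MySQL, and the `orders` schema, column names, and data are hypothetical.

```python
import sqlite3

def build_order_query(criteria):
    # Naive query builder: AND together equality filters chosen by the
    # user in a web form; values are bound as parameters, never inlined.
    where = " AND ".join(f"{col} = ?" for col in criteria)
    return f"SELECT patient_id FROM orders WHERE {where}", list(criteria.values())

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (patient_id TEXT, order_type TEXT, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("p1", "isolation", "active"),
    ("p2", "medication", "active"),
    ("p3", "isolation", "discontinued"),
])

# e.g. find patients with active 'isolation' orders; in the real tool a
# query like this would be re-executed at recurring intervals.
sql, params = build_order_query({"order_type": "isolation", "status": "active"})
hits = [row[0] for row in conn.execute(sql, params)]
# hits == ["p1"]
```

Binding the values as parameters (the `?` placeholders) is what keeps such an auto-generated query safe to run against user-supplied criteria.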
One of the main requirements in agricultural research is to analyse a large number of samples for one or more of their chemical constituents and physical properties. In plant breeding programmes and germplasm evaluation, the analysis must be fast, as many samples are to be analysed. Pulsed nuclear magnetic resonance (NMR) is a potential tool for developing rapid and nondestructive methods of analysis. Various applications of low resolution pulsed NMR in agricultural research, which are generally used as screening methods, are briefly described. 25 refs., 2 figs., 2 tabs
The GLARE: Grease Lubrication Apparatus for Research and Education was designed as a fourth-year thesis project with the University of Ontario Institute of Technology (UOIT). The purpose of the apparatus is to train Ontario Power Generation Nuclear (OPGN) staff to properly lubricate bearings with grease and to help detect early equipment failures. Proper re-lubrication is critical to the nuclear industry, as equipment may be inaccessible for long periods of time. A secondary purpose for the tool is for UOIT research and undergraduate laboratories. This abstract provides an overview of the project and its application to the nuclear industry. (author)
Daim, Tugrul; Kim, Jisun
Technologies such as renewable energy alternatives including wind, solar and biomass, storage technologies and electric engines are creating a different landscape for the electricity industry. Using sources and ideas from technologies such as renewable energy alternatives, Research and Technology Management in the Electricity Industry explores a different landscape for this industry and applies it to the electric industry supported by real industry cases. Divided into three sections, Research and Technology Management in the Electricity Industry introduces a range of methods and tools includ
Effective mentoring is a critical component in the training of early-career researchers, cultivating more independent, productive and satisfied scientists. For example, mentoring has been shown by the 2005 Sigma Xi National Postdoc Survey to be a key indicator for a successful postdoctoral outcome. Mentoring takes many forms and can include support for maximizing research skills and productivity as well as assistance in preparing for a chosen career path. Yet, because there is no "one-size-fits-all” approach, mentoring can be an activity that is hard to define. In this presentation, a series of tips and tools will be offered to aid mentors in developing a plan for their mentoring activities. This will include: suggestions for how to get started; opportunities for mentoring activities within the research group, within the institution, and outside the institution; tools for communicating and assessing professional milestones; and resources for fostering the professional and career development of mentees. Special considerations will also be presented for mentoring international scholars and women. These strategies will be helpful to the PI responding to the new NSF mentoring plan requirement for postdocs as well as to the student, postdoc, researcher or professor overseeing the research and training of others.
Question answering (QA) has become one of the fastest growing topics in computational linguistics and information access. To advance research in the area of dialogue-based question answering, we propose a combination of methods from different scientific fields (i.e., Information Retrieval, Dialogue Systems, Semantic Web, and Machine Learning). This book sheds light on adaptable dialogue-based question answering. We demonstrate the technical and computational feasibility of the proposed ideas, the introspective methods in particular, by beginning with an extensive introduction to the dialogical
A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program, to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.
Sarah E. Council
Full Text Available The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects.
Collaborative Chat Reference Service Effectiveness Varies by Question Type for Public Library Patrons. A review of: Kwon, Nahyun. "Public Library Patrons' Use of Collaborative Chat Reference Service: The Effectiveness of Question Answering by Question Type." Library & Information Science Research 29.1 (Mar. 2007): 70-91.
Full Text Available Objective – To assess the effectiveness of a collaborative chat reference service in answering different types of question. Specifically, the study compares the degree of answer completion and the level of user satisfaction for simple factual questions vs. more in-depth subject-based reference questions, and for 'local' (pertaining to a particular library) and non-local questions. Design – Content analysis of 415 transcripts of reference transactions, which were also compared to corresponding user satisfaction survey results. Setting – An online collaborative reference service offered by a large public library system (33 branch and regional locations). This service is part of the Metropolitan Cooperative Library System: a virtual reference consortium of U.S. libraries (public, academic, special, and corporate) that provides 24/7 service. Subjects – Reference librarians from around the U.S. (49 different libraries), and users logging into the service via the public library system's portal (primarily patrons of the 49 libraries). Method – Content analysis was used to evaluate virtual reference transcripts recorded between January and June, 2004. Reliability was enhanced through triangulation, with researchers comparing the content analysis of each transcript against the results of a voluntary exit survey. Of 1,387 transactions that occurred during the period of study, 420 users completed the survey and these formed the basis of the study, apart from 5 transactions that were omitted because the questions were incomprehensible. Questions were examined and assigned to five categories: "simple, factual questions; subject-based research questions; resource access questions; circulation-related questions; and local library information inquiries" (80-81). Answers were classed as either "completely answered, partially answered or unanswered, referred, and problematic endings" (82). Lastly, user satisfaction was surveyed on three
Juliano Desiderato ANTONIO
Full Text Available The aim of this paper is to describe the rhetorical structure of the argumentative answer genre in a corpus formed by 15 compositions from the winter vestibular (university entrance exam) of Universidade Estadual de Maringá. The instrument of analysis used in the investigation was RST (Rhetorical Structure Theory). The initial statement was considered the central unit of the argumentative answer. Most of the writers held an evidence relation between the central unit (nucleus) and the expansion (satellite). The evidence relation is interpersonal, and the aim of the writers is to convince their addressees (in this case, the compositions evaluation committee) that their point is correct. Within the initial statement, the relation with the highest frequency was contrast. Our hypothesis is that the selection of texts in the test influenced the applicants to present positive and negative aspects of the internet. In the higher level of the expansion text span, list is the most frequent relation, because the applicants present various arguments with the same status. Contrast was the second relation with the highest frequency at this same level. Our hypothesis is that the selection of texts in the test influenced the applicants to present positive and negative aspects of the internet, as happened in the initial statement. Within the 15 compositions, 12 had a conclusion. This part was considered a satellite of the span formed by the initial statement and its expansion. The relation held was homonymous.
Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure, such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix, it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more
Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure, such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix, it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools.
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
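The `recordr` and `matlab-dataone` APIs are not detailed in the abstract, so the sketch below only illustrates the underlying idea of automated capture: recording each execution's function, inputs, and an output digest with no change to the researcher's normal call. All names (`capture_provenance`, `PROVENANCE_LOG`, `normalize`) are hypothetical, and a real system would persist the trace rather than keep it in memory.

```python
import datetime
import functools
import hashlib
import json

PROVENANCE_LOG = []  # in-memory stand-in for an archival provenance store

def capture_provenance(func):
    """Record each call's function name, inputs, and an output digest."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        PROVENANCE_LOG.append({
            "function": func.__name__,
            "inputs": json.dumps([list(args), kwargs], default=str),
            "output_sha256": hashlib.sha256(
                json.dumps(result, default=str).encode()).hexdigest(),
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        return result
    return wrapper

@capture_provenance
def normalize(xs):
    # an ordinary analysis step; the decorator captures its provenance
    total = sum(xs)
    return [x / total for x in xs]

out = normalize([2, 2, 4])
```

Hashing the output rather than storing it keeps the trace small while still letting a later reader verify that a rerun reproduced the same derived product.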
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
Guertin, L. A.
VoiceThread has been utilized in an undergraduate research methods course for peer review and final research project dissemination. VoiceThread (http://www.voicethread.com) can be considered a social media tool, as it is a web-based technology with the capacity to enable interactive dialogue. VoiceThread is an application that allows a user to place a media collection online containing images, audio, videos, documents, and/or presentations in an interface that facilitates asynchronous communication. Participants in a VoiceThread can be passive viewers of the online content or engaged commenters via text, audio, video, with slide annotations via a doodle tool. The VoiceThread, which runs across browsers and operating systems, can be public or private for viewing and commenting and can be embedded into any website. Although few university students are aware of the VoiceThread platform (only 10% of the students surveyed by Ng (2012)), the 2009 K-12 edition of The Horizon Report (Johnson et al., 2009) lists VoiceThread as a tool to watch because of the opportunities it provides as a collaborative learning environment. In Fall 2011, eleven students enrolled in an undergraduate research methods course at Penn State Brandywine each conducted their own small-scale research project. Upon conclusion of the projects, students were required to create a poster summarizing their work for peer review. To facilitate the peer review process outside of class, each student-created PowerPoint file was placed in a VoiceThread with private access to only the class members and instructor. Each student was assigned to peer review five different student posters (i.e., VoiceThread images) with the audio and doodle tools to comment on formatting, clarity of content, etc. After the peer reviews were complete, the students were allowed to edit their PowerPoint poster files for a new VoiceThread. In the new VoiceThread, students were required to video record themselves describing their research
M. M. Aligadjiev
Full Text Available Aim. The paper discusses the improvement of methods of hydrobiological studies by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of selecting hydrobiological samples in the Caspian Sea shows that modernization of the sampling tools used to collect hydrobiological material is required. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz to the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of the samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples, with an upper inverted cone that will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.
Full Text Available Assembly accounts for the greatest workload and time consumed during the product design and manufacturing process. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information of transmission, and geometric constraint information were quantified as assembly connection strength to depict the level of assembly difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was conducted, and interferential assembly units were identified and revised. Assembly sequence planning and optimization of parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explored assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and realized the optimized assembly sequence of the headstock of a CNC machine tool.
Full Text Available The multi-disciplinary and international nature of large European projects requires powerful managerial and communicative tools to ensure the transmission of information to the end-users. One such project is TRACE, entitled “Tracing Food Commodities in Europe”. One of its objectives is to provide a communication system designed to be the central source of information on food authenticity and traceability in Europe. This paper explores the web tools used and the communication vehicles offered to scientists involved in the TRACE project to communicate internally as well as with the public. Two main tools have been built: an Intranet and a public website. The TRACE website can be accessed at http://www.trace.eu.org. Particular emphasis was placed on the efficiency, relevance and accessibility of the information, the publicity of the website, and the use of the collaborative utilities. The rationale of the web space design as well as the integration of proprietary software solutions are presented. Perspectives on the use of web tools in research projects are discussed.
Argyropoulou, Eleftheria; Hatira, Kalliopi
This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…
Rodney R. Dietert
Full Text Available Academic preparation of science researchers and/or human or veterinary medicine clinicians through the science, technology, engineering, and mathematics (STEM) curriculum has usually focused on the students (1) acquiring increased disciplinary expertise, (2) learning needed methodologies and protocols, and (3) expanding their capacity for intense, persistent focus. Such educational training is effective until roadblocks or problems arise that this highly learned approach cannot resolve. Then, the health science trainee may have few tools available for effective problem solving. Training to achieve flexibility, adaptability, and broadened perspectives using contemplative practices has been rare among biomedical education programs. To address this gap, a Cornell University-based program involving formal biomedical science coursework and health science workshops has been developed to offer science students, researchers, and health professionals a broader array of personal, contemplation-based, problem-solving tools. This STEM educational initiative includes first-person exercises designed to broaden perceptional awareness, decrease emotional drama, and mobilize whole-body strategies for creative problem solving. Self-calibration and journaling are used for students to evaluate the personal utility of each exercise. The educational goals are to increase student self-awareness and self-regulation and to provide trainees with value-added tools for career-long problem solving. Basic elements of this educational initiative are discussed using the framework of the Tree of Contemplative Practices.
Full Text Available The paper describes the research and development of the casting and solidification of slab ingots from special tool steels by means of numerical modelling using the finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Problems with determining the thermophysical properties of materials and the heat transfer between the individual parts of the casting system are also discussed. Based on the grade of tool steel, the risk of final porosity is predicted. The results made it possible to improve the production technology of slab ingots, and also to verify the ratio, the chamfer and the external/internal shape of the wall of the newly designed slab ingots.
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Over the last 60 years research reactors (RRs) have played an important role in the technological and socio-economic development of mankind, for example through radioisotope production for medicine, industry, research and education. Neutron scattering has been widely used for research and development in materials science. The prospect of neutron scattering as a powerful tool for materials research is increasing in the 21st century. This can be seen from the investment in several new neutron sources all over the world, such as the Spallation Neutron Source (SNS) in the USA, the Japan Proton Accelerator Research Complex (J-PARC) in Japan, and the new OPAL reactor in Australia, and from upgrades to existing sources at ISIS, Rutherford Appleton Laboratory, UK; the Institut Laue-Langevin (ILL) in Grenoble, France; and the Berlin reactor, Germany. Developing countries with moderate-flux research reactors, such as India, Malaysia and Indonesia, have also become involved in this technique. The Siwabessy Multipurpose Reactor in Serpong, Indonesia, which also produces thermal neutrons, has contributed to research and development in the Asia-Pacific region. However, international joint research among those countries plays an important role in optimizing the results. (author)
Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa
To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.
Full Text Available Online social spaces, where users can exchange information, opinions and resources, have achieved wide popularity and are gaining attention in many research fields, including education. Their actual potential support to learning, however, still requires investigation, especially because portals can widely differ as concerns purpose and internal structure. This paper aims to contribute in this respect, by concentrating on question answering, a kind of social space not yet widely discussed in education. We analyzed a small corpus of posts from the Languages section of Yahoo! Answers Italy, checking if the questions reveal some inclination to learning or just the desire to obtain a service and if the answers provided by the community members can be considered as reliable sources of knowledge. Our analysis highlights the presence of a variety of question/answer types, from mere information exchange or help for task completion, up to language-related questions prompting valuable short lessons. The quality of answers may widely vary as concerns pertinence, correctness and richness of supporting elements. We found a high number of purely task-oriented questions and answers, but also a higher number of learning-oriented questions and correct, informative answers. This suggests that this kind of social space actually has valuable potential for informal learning.
Full Text Available The UbuntuNet Alliance is well placed to facilitate interaction between education and research institutions and African academics and researchers in the Diaspora, so that together they can strengthen research that will exploit new technological tools and increase the industrial base. It is envisaged that the Alliance will become an important vehicle for linkages that will facilitate the repatriation of scientific knowledge and skills to Africa, and even help reduce and eventually eradicate the brain drain which has taken so many excellent intellectuals to the developed world. As organisational vehicles for inter-institutional collaboration, both established and emerging NRENs can play a critical role in reversing these trends and in mitigating what appears to be the negative impact of the brain drain.
Ormand, C. J.; Shipley, T. F.; Dutrow, B. L.; Goodwin, L. B.; Hickson, T. A.; Tikoff, B.; Atit, K.; Gagnier, K. M.; Resnick, I.
Spatial visualization is an essential skill in the STEM disciplines, including the geological sciences. Undergraduate students, including geoscience majors in upper-level courses, bring a wide range of spatial skill levels to the classroom. Students with weak spatial skills may struggle to understand fundamental concepts and to solve geological problems with a spatial component. However, spatial thinking skills are malleable. Using strategies that have emerged from cognitive science research, we developed a set of curricular materials that improve undergraduate geology majors' abilities to reason about 3D concepts and to solve spatially complex geological problems. Cognitive science research on spatial thinking demonstrates that predictive sketching, making visual comparisons, gesturing, and the use of analogy can be used to develop students' spatial thinking skills. We conducted a three-year study of the efficacy of these strategies in strengthening the spatial skills of students in core geology courses at three universities. Our methodology is a quasi-experimental quantitative design, utilizing pre- and post-tests of spatial thinking skills, assessments of spatial problem-solving skills, and a control group comprised of students not exposed to our new curricular materials. Students taught using the new curricular materials show improvement in spatial thinking skills. Further analysis of our data, to be completed prior to AGU, will answer additional questions about the relationship between spatial skills and academic performance, spatial skills and gender, spatial skills and confidence, and the impact of our curricular materials on students who are struggling academically. Teaching spatial thinking in the context of discipline-based exercises has the potential to transform undergraduate education in the geological sciences by removing one significant barrier to success.
Darling, John A.; Frederick, Raymond M.
Understanding the risks of biological invasion posed by ballast water (whether in the context of compliance testing, routine monitoring, or basic research) is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high-throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size
Hoekstra, A.H.; Hiemstra, Djoerd; van der Vet, P.E.; Huibers, Theo W.C.; Schobbens, Pierre-Yves; Vanhoof, Wim; Schwanen, Gabriel
When people pose questions in natural language to search for information on the web, the role of question answering (QA) systems becomes important. In this paper the QA system simpleQA, capable of answering Dutch questions whose answer is a person or a location, is described. The system's
... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Answer; default. 10.64 Section 10.64... SERVICE Rules Applicable to Disciplinary Proceedings § 10.64 Answer; default. (a) Filing. The respondent's... need be adduced at a hearing. (d) Default. Failure to file an answer within the time prescribed (or...
Brøndsted, Tom; Larsen, Henrik Legind; Larsen, Lars Bo
window focused over the part which most likely contains an answer to the query. The two systems are integrated into a full spoken query answering system. The prototype can answer queries and questions within the chosen football (soccer) test domain, but the system has the flexibility for being ported...
... 6 Domestic Security 1 2010-01-01 2010-01-01 false Answer. 13.9 Section 13.9 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROGRAM FRAUD CIVIL REMEDIES § 13.9 Answer. (a) The Defendant may request a hearing by serving an answer on the Reviewing Official within 30 days of service of...
Sade, Christian; de Barros, Leticia Maria Renault; Melo, Jorge José Maciel; Passos, Eduardo
This paper seeks to assess a way of conducting interviews in line with the ideology of Brazilian Psychiatric Reform. In the methodology of participative intervention and research in mental health, the interview is less a data collection than a data harvesting procedure. It is designed to apply the principles of psychosocial care, autonomy as the basis for treatment, the predominance of the users and of their social networks and civic participation. Inspired by the Explicitation Interview technique, the contention is that the handling of the interview presupposes an open attitude able to promote and embrace different viewpoints. This attitude makes the interview a collective experience of sharing and belonging, allowing participants to reposition themselves subjectively in treatment with the emergence of groupality. As an example of using the interview as a methodological tool in mental health research, we examine research into adaptation of the tool of Autonomous Medication Management (GAM). It is an interventionist approach guided by principles that foster autonomy and the protagonist status of users of psychotropic medication, their quality of life, their rights and recognition of the multiple significances of medication, understood here as a collective interview technique.
Maimon, Eric; Samuni, Uri; Goldstein, Sara
Radicals are part of the chemistry of life, and ionizing radiation chemistry serves as an indispensable research tool for elucidation of the mechanism(s) underlying their reactions. The ever-increasing understanding of their involvement in diverse physiological and pathological processes has expanded the search for compounds that can diminish radical-induced damage. This review surveys the areas of research focusing on radical reactions and particularly with stable cyclic nitroxide radicals, which demonstrate unique antioxidative activities. Unlike common antioxidants that are progressively depleted under oxidative stress and yield secondary radicals, nitroxides are efficient radical scavengers yielding in most cases their respective oxoammonium cations, which are readily reduced back in the tissue to the nitroxide thus continuously being recycled. Nitroxides, which not only protect enzymes, cells, and laboratory animals from diverse kinds of biological injury, but also modify the catalytic activity of heme enzymes, could be utilized in chemical and biological systems serving as a research tool for elucidating mechanisms underlying complex chemical and biochemical processes.
Horvath, Monica M; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey
In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to provide this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust web application for cohort identification and data extraction: the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides Guided Query functionality through a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must provide sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. Copyright © 2010 Elsevier Inc. All rights reserved.
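The core idea of a guided-query interface such as the one described above can be sketched as follows: structured filter choices from a wizard are translated into a parameterized SQL query, so the investigator never writes SQL directly. This is an illustrative sketch only; the table and column names are invented and do not reflect DEDUCE's actual schema or implementation.

```python
def build_cohort_query(filters):
    """filters: list of (column, operator, value) tuples chosen in a wizard.
    Returns (sql, params); values go through '?' placeholders so user input
    never lands in the SQL string itself. Columns are assumed to come from
    a fixed wizard vocabulary, and operators are checked against a whitelist."""
    allowed_ops = {"=", "<", ">", "<=", ">=", "LIKE"}
    clauses, params = [], []
    for column, op, value in filters:
        if op not in allowed_ops:
            raise ValueError(f"unsupported operator: {op}")
        clauses.append(f"{column} {op} ?")
        params.append(value)
    sql = "SELECT patient_id FROM encounters"  # hypothetical table
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

# Hypothetical cohort: patients 65+ with a type 2 diabetes diagnosis code.
sql, params = build_cohort_query([
    ("diagnosis_code", "LIKE", "E11%"),
    ("age_at_visit", ">=", 65),
])
print(sql)     # SELECT patient_id FROM encounters WHERE diagnosis_code LIKE ? AND age_at_visit >= ?
print(params)  # ['E11%', 65]
```

The wizard supplies only structured tuples, so the researcher sees aggregate counts and export buttons rather than the query text, which is the usability point the abstract makes.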
Using a question-and-answer format, we describe important aspects of using genomic technologies in cancer research. The main challenges are not managing the mass of data, but rather the design, analysis, and accurate reporting of studies that result in increased biological knowledge and medical utility. Many analysis issues address the use of expression microarrays but are also applicable to other whole-genome assays. Microarray-based clinical investigations have generated both unrealistic hyperbole and excessive skepticism. Genomic technologies are tremendously powerful and will play instrumental roles in elucidating the mechanisms of oncogenesis and in developing an era of predictive medicine in which treatments are tailored to individual tumors. Achieving these goals involves challenges in rethinking many paradigms for the conduct of basic and clinical cancer research and for the organization of interdisciplinary collaboration. PMID:18582627
Tomás, Concepción; Yago, Teresa; Eguiluz, Mercedes; Samitier, M A Luisa; Oliveros, Teresa; Palacios, Gemma
To validate the questionnaire "Gender Perspective in Health Research" (GPIHR) to assess the inclusion of a gender perspective in research projects. Validation study in two stages: feasibility was analysed in the first, and reliability, internal consistency and validity in the second. Aragón Institute of Health Sciences, Aragón, Spain. GPIHR was applied to 118 research projects funded in national and international competitive tenders from 2003 to 2012. Analysis of inter- and intra-observer reliability with the kappa index and internal consistency with Cronbach's alpha. Content validity was analysed through literature review and construct validity with an exploratory factor analysis. The validated GPIHR has 10 questions: 3 on the introduction, 1 on objectives, 3 on methodology and 3 on research purpose. Average time of application was 13 min. Inter-observer reliability (kappa) varied between 0.35 and 0.94 and intra-observer between 0.40 and 0.94. The theoretical construct is supported in the literature. Factor analysis identified three levels of gender perspective inclusion: "difference by sex", "gender sensitive" and "feminist research", with internal consistencies of 0.64, 0.87 and 0.81, respectively, which explain 74.78% of the variance. The GPIHR questionnaire is a valid tool to assess gender perspective and useful for researchers who would like to include it in their projects. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A
Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884
Combination of bioaffinity and chromatography gave birth to affinity chromatography. A further combination with frontal analysis resulted in creation of frontal affinity chromatography (FAC). This new versatile research tool enabled detailed analysis of weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC now becomes the best method for the investigation of saccharide-binding proteins (lectins) from viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology. It opened a door leading to deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described. PMID:25169774
Park, Sinyoung; Nam, Chung Mo; Park, Sejung; Noh, Yang Hee; Ahn, Cho Rong; Yu, Wan Sun; Kim, Bo Kyung; Kim, Seung Min; Kim, Jin Seok; Rha, Sun Young
With the growing amount of clinical research, regulations and research ethics are becoming more stringent. This trend introduces a need for quality assurance measures to ensure adherence to research ethics and human research protection beyond Institutional Review Board approval. Audits, one of the most effective tools for assessing quality assurance, are used to evaluate Good Clinical Practice (GCP) and protocol compliance in clinical research. However, they are laborious, time consuming, and require expertise. Therefore, we developed a simple auditing process (a screening audit) and evaluated its feasibility and effectiveness. The screening audit was developed using a routine audit checklist based on the Severance Hospital's Human Research Protection Program policies and procedures. The measure includes 20 questions, and results are summarized in five categories of audit findings. We analyzed 462 studies that were reviewed by the Severance Hospital Human Research Protection Center between 2013 and 2017. We retrospectively analyzed research characteristics, reply rate, audit findings, associated factors, post-screening audit compliance, etc. RESULTS: Investigator reply rates gradually increased, except for the first year (73% → 26% → 53% → 49% → 55%). The studies were graded as "critical," "major," "minor," and "not a finding" (11.9%, 39.0%, 42.9%, and 6.3%, respectively), based on findings and number of deficiencies. The auditors' decisions showed fair agreement, with weighted kappa values of 0.316, 0.339, and 0.373. Low-risk studies, single-center studies, and non-phase clinical research were more frequently graded "major" or "critical" (p = 0.002). Audit grade was also reflected in the results of post-screening audit compliance checks in "non-responding" and "critical" studies upon applying the screening audit. Our screening audit is a simple and effective way to assess overall GCP compliance by institutions and to
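The inter-auditor agreement statistic reported above, weighted kappa, can be computed directly for ordered grades. The sketch below is a generic linear-weighted Cohen's kappa for two raters; the grade labels and toy ratings are hypothetical and are not the study's data.

```python
def weighted_kappa(rater_a, rater_b, categories):
    """Linear-weighted Cohen's kappa for two raters over ordered categories
    (e.g. ordered audit grades). Disagreement weight between categories i and
    j is |i - j| / (k - 1); kappa = 1 - observed/expected weighted disagreement."""
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)
    # observed joint distribution of the two raters' grades
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[index[a]][index[b]] += 1 / n
    # marginal distributions give the chance-expected disagreement
    pa = [sum(row) for row in obs]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    disagree = lambda i, j: abs(i - j) / (k - 1)
    num = sum(disagree(i, j) * obs[i][j] for i in range(k) for j in range(k))
    den = sum(disagree(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1 - num / den

# Hypothetical ratings of four studies by two auditors.
grades = ["minor", "major", "critical"]
a = ["minor", "minor", "major", "critical"]
b = ["minor", "major", "major", "critical"]
print(weighted_kappa(a, b, grades))
```

Unlike unweighted kappa, adjacent-grade disagreements ("minor" vs "major") are penalized less than extreme ones ("minor" vs "critical"), which matches how ordinal audit grades are usually compared.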
Full Text Available This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, as well as executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
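The cost minimization described above, choosing one mapping per phase so that computation plus inter-phase remapping cost is minimal, has a natural dynamic-programming formulation when phases form a sequence. The sketch below is illustrative only, not DDT's actual algorithm; the mapping names and cost numbers are hypothetical.

```python
def best_mapping_sequence(phases, remap_cost):
    """phases: list of dicts {mapping_name: computation_cost}, one per phase.
    remap_cost(a, b): cost of redistributing arrays from mapping a to b.
    Returns (total_cost, chosen_mappings) by dynamic programming over phases."""
    # best[m] = (cheapest accumulated cost of a plan ending in mapping m, path)
    best = {m: (c, [m]) for m, c in phases[0].items()}
    for phase in phases[1:]:
        new_best = {}
        for m, comp in phase.items():
            # pick the predecessor mapping minimizing accumulated + remap cost
            prev_cost, path = min(
                ((best[p][0] + remap_cost(p, m), best[p][1]) for p in best),
                key=lambda t: t[0],
            )
            new_best[m] = (prev_cost + comp, path + [m])
        best = new_best
    return min(best.values(), key=lambda t: t[0])

# Hypothetical example: two candidate HPF-style distributions per phase.
phases = [
    {"BLOCK": 4, "CYCLIC": 6},
    {"BLOCK": 9, "CYCLIC": 3},
    {"BLOCK": 2, "CYCLIC": 8},
]
cost, plan = best_mapping_sequence(
    phases, lambda a, b: 0 if a == b else 5  # flat remapping penalty
)
print(cost, plan)  # 15 ['BLOCK', 'BLOCK', 'BLOCK']
```

With a high remapping penalty the plan sticks with one distribution even when another phase would compute faster under a different one, which is exactly the movement-versus-computation trade-off the abstract describes.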
Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C
River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
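The idea of systematically flooding a river network to delineate valley extent can be illustrated with a toy raster model. This is a deliberately simplified sketch, not the FLDPLN model itself: it floods a small elevation grid outward from river cells up to a fixed depth above the river, using a 4-connected breadth-first search. The grid values are hypothetical.

```python
from collections import deque

def flood_extent(elev, river_cells, depth):
    """Return the set of (row, col) cells reachable from the river through
    terrain at or below (lowest river elevation + depth), via 4-connected
    breadth-first search over a nested-list elevation grid."""
    rows, cols = len(elev), len(elev[0])
    base = min(elev[r][c] for r, c in river_cells)  # river reference level
    flooded, queue = set(river_cells), deque(river_cells)
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in flooded
                    and elev[nr][nc] <= base + depth):
                flooded.add((nr, nc))
                queue.append((nr, nc))
    return flooded

# Hypothetical 3x3 elevation grid with the river at the center cell.
elev = [
    [5, 2, 5],
    [5, 1, 4],
    [6, 2, 7],
]
print(flood_extent(elev, [(1, 1)], depth=1))  # the low north-south corridor floods
```

Repeating this for increasing depth steps yields a sequence of inundation extents whose shape summarizes valley morphology, which is the role the 2D flood model plays in the classification pipeline described above.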
Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data while another has sparse; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques commonly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.
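The "rudimentary" baseline the abstract contrasts with interactive volume rendering, the maximal intensity projection, is simple enough to sketch directly: for each pixel, take the brightest voxel along the viewing axis. The tiny 3-slice stack below is hypothetical; real confocal data would be a per-channel 3D intensity array.

```python
def max_intensity_projection(volume):
    """Project a z-stack (list of 2D slices) along z by taking, for each
    (y, x) pixel, the maximum voxel intensity across all slices."""
    return [
        [max(slice_[y][x] for slice_ in volume)
         for x in range(len(volume[0][0]))]
        for y in range(len(volume[0]))
    ]

# Hypothetical single-channel stack of three 2x2 slices.
stack = [
    [[1, 7], [0, 2]],   # z = 0
    [[4, 3], [9, 1]],   # z = 1
    [[2, 5], [3, 8]],   # z = 2
]
print(max_intensity_projection(stack))  # [[4, 7], [9, 8]]
```

The projection discards all depth information (each output pixel keeps only one voxel), which is why multi-channel transfer functions and interactive rendering, as described above, are needed to reveal three-dimensional relationships.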
Borges, T.; Stafford, R.S.; Lu, P.Y.; Carter, D.
NUREG/CR-6204 is a collection of questions and answers that were originally issued in seven sets and which pertain to revised 10 CFR Part 20. The questions came from both outside and within the NRC. The answers were compiled and provided by NRC staff within the offices of Nuclear Reactor Regulation, Nuclear Material Safety and Safeguards, Nuclear Regulatory Research, the Office of State Programs, and the five regional offices. Although all of the questions and answers have been reviewed by attorneys in the NRC Office of the General Counsel, they do not constitute official legal interpretations relevant to revised 10 CFR Part 20. The questions and answers do, however, reflect NRC staff decisions and technical opinions on aspects of the revised 10 CFR Part 20 regulatory requirements. This NUREG is being made available to encourage communication among the public, industry, and NRC staff concerning the major revisions of the NRC's standards for protection against radiation.
Lee Peter A
Background Noonan syndrome (NS) is a genetic disorder characterized by phenotypic features including facial dysmorphology, cardiovascular anomalies, and short stature. Growth hormone (GH) has been approved by the United States Food and Drug Administration for short stature in children with NS. The objective of this analysis was to assess the height standard deviation score (HSDS) and change in HSDS (ΔHSDS) for up to 4 years (Y4) of GH therapy in children with NS. Methods The American Norditropin Studies: Web-Enabled Research (ANSWER) Program®, a US-based registry, collects long-term efficacy and safety information on patients treated with Norditropin® (somatropin, rDNA origin; Novo Nordisk A/S) at the discretion of participating physicians. A total of 120 children (90 boys, 30 girls) with NS, naïve to previous GH treatment, were included in this analysis. Results The mean (SD) baseline age of subjects (n = 120) was 9.2 (3.8) years. Mean (SD) HSDS increased from –2.65 (0.73) at baseline to –1.32 (1.11) at Y4 (n = 17). Subjects showed a continued increase in HSDS from baseline to Y4, without significant differences between genders at Y1 or Y2. The mean (SD) GH dose was 47 (11) mcg/kg/day at baseline and 59 (16) mcg/kg/day at Y4. There was a negative correlation between baseline age and ΔHSDS at Y1 (R = –0.3156; P = 0.0055) and Y2 (R = –0.3394; P = 0.017). ΔHSDS at Y1 was significantly correlated with ΔHSDS at Y2 (n = 37; R = 0.8527, P Conclusions GH treatment-naïve patients with NS showed continued increases in HSDS during 4 years of treatment with GH, with no significant differences between genders up to 2 years. Baseline age was negatively correlated with ΔHSDS at Y1 and Y2. Whether long-term therapy in NS results in a continued increase in HSDS to adult height remains to be investigated. Trial registration ClinicalTrials.gov NCT01009905
Jan 19, 2015 ... One research assistant was available to assist the learners and to answer questions while they completed the questionnaires during a classroom period.
Background Currently over 50% of drugs prescribed to children have not been evaluated properly for use in their age group. One key reason why children have been excluded from clinical trials is that they are not considered able to exercise meaningful autonomy over the decision to participate. Dutch law states that competence to consent can be presumed present at the age of 12 and above; however, in pediatric practice children’s competence is not that clearly presented and the transition from assent to active consent is gradual. A gold standard for competence assessment in children does not exist. In this article we describe a study protocol on the development of a standardized tool for assessing competence to consent in research in children and adolescents. Methods/design In this study we modified the MacCAT-CR, the best evaluated competence assessment tool for adults, for use in children and adolescents. We will administer the tool prospectively to a cohort of pediatric patients from 6 to 18 years during the selection stages of ongoing clinical trials. The outcomes of the MacCAT-CR interviews will be compared to a reference standard, established by the judgments of clinical investigators, and an expert panel consisting of child psychiatrists, child psychologists and medical ethicists. The reliability, criterion-related validity and reproducibility of the tool will be determined. As MacCAT-CR is a multi-item scale consisting of 13 items, power was justified at 130–190 subjects, providing a minimum of 10–15 observations per item. MacCAT-CR outcomes will be correlated with age, life experience, IQ, ethnicity, socio-economic status and competence judgment of the parent(s). It is anticipated that 160 participants will be recruited over 2 years to complete enrollment. Discussion A validity study on an assessment tool of competence to consent is strongly needed in research practice, particularly in the child and adolescent population. In this study we will establish
Hein Irma M
Hege, Inga; Kononowicz, Andrzej A; Adler, Martin
Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Keeping our concept grounded in a former qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map while they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings (MeSH), a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports the acquisition of clinical reasoning skills. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
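Comparing a learner's concept map against the expert map at each stage reduces, in its simplest form, to set overlap on MeSH-coded nodes and edges. A minimal sketch under that assumption (the metric names, map encoding, and sample MeSH IDs below are illustrative; the published tool's actual scoring is not specified in this abstract):

```python
def compare_maps(student, expert):
    """Compare a learner's illness-script concept map against an expert map.
    Each map is a dict with a set of nodes (MeSH descriptor IDs) and a set
    of directed edges (pairs of IDs). Returns simple overlap metrics."""
    node_hits = student["nodes"] & expert["nodes"]
    edge_hits = student["edges"] & expert["edges"]
    return {
        "node_recall": len(node_hits) / len(expert["nodes"]),
        "edge_recall": len(edge_hits) / len(expert["edges"]),
        "extra_nodes": student["nodes"] - expert["nodes"],  # not in expert map
    }

# Hypothetical example: expert links a diagnosis node to two finding nodes.
expert = {"nodes": {"D003920", "D007333", "D011507"},
          "edges": {("D003920", "D007333"), ("D003920", "D011507")}}
student = {"nodes": {"D003920", "D007333"},
           "edges": {("D003920", "D007333")}}
report = compare_maps(student, expert)
```

Using a controlled vocabulary as the node space is what makes such a comparison feasible at all: two maps drawn in free text would rarely share exact node labels.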
ADENIYI AKINGBADE WAIDI
A questionnaire is a set of questions designed to gather information or data for analysis. A questionnaire has to be adequate, simple, focused, and related to the subject the research sets out to address, and it must test the hypotheses and questions formulated for the study. However, many questionnaires are constructed and administered without following proper guidelines, which undermines their end results. This paper assesses some of the guides for constructing a questionnaire, as well as its uses and the extent to which it enhances managers' access to reliable data and information. A descriptive method is employed for the study. Findings revealed that a poorly prepared questionnaire does not provide effective results. Managers and researchers who use such questionnaires hardly achieve their organisational and research objectives. The need for a good, well-prepared, and adequate questionnaire is exemplified by its being the primary tool for analytical research. The study recommends that questionnaires be properly prepared for effective research outcomes.
Pritchett, Amy R.
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
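The core described above — an interface standard for components, a registry, and communication between components — can be sketched as a publish/subscribe bus. Python is used here for brevity and every class name is illustrative; RFS itself is built from pre-compiled plugin libraries and also supports HLA networking, none of which this sketch attempts:

```python
class SimComponent:
    """Interface standard that every plugin-style component implements."""
    name = "component"

    def init(self, bus):
        self.bus = bus          # core hands each component the message bus

    def step(self, dt):
        pass                    # advance the component by dt seconds


class MessageBus:
    """Core: registers/initializes components and routes messages."""
    def __init__(self):
        self.components = []
        self.subscribers = {}

    def register(self, comp):
        comp.init(self)
        self.components.append(comp)

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, data):
        for handler in self.subscribers.get(topic, []):
            handler(data)

    def step(self, dt):
        for comp in self.components:
            comp.step(dt)


class Aircraft(SimComponent):
    """Toy dynamics component that publishes its state each frame."""
    name = "aircraft"

    def __init__(self):
        self.altitude = 0.0

    def step(self, dt):
        self.altitude += 10.0 * dt          # climb at 10 units/s
        self.bus.publish("state", {"altitude": self.altitude})


bus = MessageBus()
log = []
bus.subscribe("state", log.append)          # e.g. a display or data logger
bus.register(Aircraft())
bus.step(0.1)                               # one simulation frame
```

Because components talk only through the bus, a display, autopilot, or logger can be swapped in without touching the dynamics code, which is the independence property the report attributes to its plugin modularity.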
Kopton, Isabella M.; Kenning, Peter
Over the last decade, the application of neuroscience to economic research has gained in importance and the number of neuroeconomic studies has grown extensively. The most common method for these investigations is fMRI. However, fMRI has limitations (particularly concerning situational factors) that should be countered with other methods. This review elaborates on the use of functional Near-Infrared Spectroscopy (fNIRS) as a new and promising tool for investigating economic decision making both in field experiments and outside the laboratory. We describe results of studies investigating the reliability of prototype NIRS studies, as well as detailing experiments using conventional and stationary fNIRS devices to analyze this potential. This review article shows that further research using mobile fNIRS for studies on economic decision making outside the laboratory could be a fruitful avenue helping to develop the potential of a new method for field experiments outside the laboratory. PMID:25147517
At last, the first systematic guide to the growing jungle of citation indices and other bibliometric indicators. Written with the aim of providing a complete and unbiased overview of all available statistical measures for scientific productivity, the core of this reference is an alphabetical dictionary of indices and other algorithms used to evaluate the importance and impact of researchers and their institutions. In 150 major articles, the authors describe all indices in strictly mathematical terms without passing judgement on their relative merit. From widely used measures, such as the journal impact factor or the h-index, to highly specialized indices, all indicators currently in use in the sciences and humanities are described, and their application explained. The introductory section and the appendix contain a wealth of valuable supporting information on data sources, tools and techniques for bibliometric and scientometric analysis - for individual researchers as well as their funders and publishers.
Gelinas, Luke; Pierce, Robin; Winkler, Sabune; Cohen, I Glenn; Lynch, Holly Fernandez; Bierer, Barbara E
Convertino, V. A.
Lower body negative pressure (LBNP) has been extensively used for decades in aerospace physiological research as a tool to investigate cardiovascular mechanisms that are associated with or underlie performance in aerospace and military environments. In comparison with clinical stand and tilt tests, LBNP represents a relatively safe methodology for inducing highly reproducible hemodynamic responses during exposure to footward fluid shifts similar to those experienced under orthostatic challenge. By maintaining an orthostatic challenge in a supine posture, removal of leg support (muscle pump) and head motion (vestibular stimuli) during LBNP provides the capability to isolate cardiovascular mechanisms that regulate blood pressure. LBNP can be used for physiological measurements, clinical diagnoses and investigational research comparisons of subject populations and alterations in physiological status. The applications of LBNP to the study of blood pressure regulation in spaceflight, ground-based simulations of low gravity, and hemorrhage have provided unique insights and understanding for development of countermeasures based on physiological mechanisms underlying the operational problems.
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila
The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.
Li, Linda C; Adam, Paul M; Townsend, Anne F; Lacaille, Diane; Yousefi, Charlene; Stacey, Dawn; Gromala, Diane; Shaw, Chris D; Tugwell, Peter; Backman, Catherine L
Decision aids are evidence-based tools designed to inform people of the potential benefit and harm of treatment options, clarify their preferences and provide a shared decision-making structure for discussion at a clinic visit. For patients with rheumatoid arthritis (RA) who are considering methotrexate, we have developed a web-based patient decision aid called the ANSWER (Animated, Self-serve, Web-based Research Tool). This study aimed to: 1) assess the usability of the ANSWER prototype; 2) identify strengths and limitations of the ANSWER from the patient's perspective. The ANSWER prototype consisted of: 1) six animated patient stories and narrated information on the evidence of methotrexate for RA; 2) interactive questionnaires to clarify patients' treatment preferences. Eligible participants for the usability test were patients with RA who had been prescribed methotrexate. They were asked to verbalize their thoughts (i.e., think aloud) while using the ANSWER, and to complete the System Usability Scale (SUS) to assess overall usability (range = 0-100; higher = more user friendly). Participants were audiotaped and observed, and field notes were taken. The testing continued until no new modifiable issues were found. We used descriptive statistics to summarize participant characteristics and the SUS scores. Content analysis was used to identify usability issues and navigation problems. Fifteen patients participated in the usability testing. The majority were aged 50 or over and were university/college graduates (n = 8, 53.4%). On average they took 56 minutes (SD = 34.8) to complete the tool. The mean SUS score was 81.2 (SD = 13.5). Content analysis of audiotapes and field notes revealed four categories of modifiable usability issues: 1) information delivery (i.e., clarity of the information and presentation style); 2) navigation control (i.e., difficulties in recognizing and using the navigation control buttons); 3) layout (i.e., position of the
Amon, Krestina L; Campbell, Andrew J; Hawke, Catherine; Steinbeck, Katharine
Researchers are increasingly using social media to recruit participants to surveys and clinical studies. However, the evidence of the efficacy and validity of adolescent recruitment through Facebook is yet to be established. To conduct a systematic review of the literature on the use of Facebook to recruit adolescents for health research. Nine electronic databases and reference lists were searched for articles published between 2004 and 2013. Studies were included in the review if: 1) participants were aged ≥ 10 to ≤ 18 years, 2) studies addressed a physical or mental health issue, 3) Facebook was identified as a recruitment tool, 4) recruitment details using Facebook were outlined in the methods section and considered in the discussion, or information was obtained by contacting the authors, 5) results revealed how many participants were recruited using Facebook, and 6) studies addressed how adolescent consent and/or parental consent was obtained. Titles, abstracts, and keywords were scanned and duplicates removed by 2 reviewers. Full text was evaluated for inclusion criteria, and 2 reviewers independently extracted data. The search resulted in 587 publications, of which 25 full-text papers were analyzed. Six studies met all the criteria for inclusion in the review. Three recruitment methods using Facebook were identified: 1) paid Facebook advertising, 2) use of the Facebook search tool, and 3) creation and use of a Facebook Page. Eligible studies described the use of paid Facebook advertising and Facebook as a search tool as methods to successfully recruit adolescent participants. Online and verbal consent was obtained from participants recruited from Facebook.
Miller, Brian W.; Morisette, Jeffrey T.
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Piet, S.J.; Dixon, B.W.; Bennett, R.G.; Smith, J.D.; Hill, R.N.
Given the range of fuel cycle goals and criteria, and the wide range of fuel cycle options, how can the set of options eventually be narrowed in a transparent and justifiable fashion? It is impractical to develop all options. We suggest an approach that starts by considering a range of goals for the Advanced Fuel Cycle Initiative (AFCI) and then posits seven questions, such as whether Cs and Sr isotopes should be separated from spent fuel and, if so, what should be done with them. For each question, we consider which of the goals may be relevant to eventually providing answers. The AFCI program has both "outcome" and "process" goals because it must address both waste already accumulating as well as completing the fuel cycle in connection with advanced nuclear power plant concepts. The outcome objectives are waste geologic repository capacity and cost, energy security and sustainability, proliferation resistance, fuel cycle economics, and safety. The process objectives are readiness to proceed and adaptability and robustness in the face of uncertainties
Fourmond, V; Léger, C
This chapter presents the fundamentals of electrochemistry in the context of protein electrochemistry. We discuss redox proteins and enzymes that are not photoactive. Of course, the principles described herein also apply to photobioelectrochemistry, as discussed in later chapters of this book. Depending on which experiment is considered, electron transfer between proteins and electrodes can be either direct or mediated, and achieved in a variety of configurations: with the protein and/or the mediator free to diffuse in solution, immobilized in a thick, hydrated film, or adsorbed as a sub-monolayer on the electrode. The experiments can be performed with the goal to study the protein or to use it. Here emphasis is on mechanistic studies, which are easier in the configuration where the protein is adsorbed and electron transfer is direct, but we also explain the interpretation of signals obtained when diffusion processes affect the response.This chapter is organized as a series of responses to questions. Questions 1-5 are related to the basics of electrochemistry: what does "potential" or "current" mean, what does an electrochemical set-up look like? Questions 6-9 are related to the distinction between adsorbed and diffusive redox species. The answers to questions 10-13 explain the interpretation of slow and fast scan voltammetry with redox proteins. Questions 14-19 deal with catalytic electrochemistry, when the protein studied is actually an enzyme. Questions 20, 21 and 22 are general.
Madiedo, J. M.
The Virtual Museum for Meteorites (Figure 1) was created as a tool for students, educators and researchers [1, 2]. One of the aims of this online resource is to promote interest in meteorites. Thus, the role of meteorites in education and outreach is fundamental, as these are very valuable tools to promote the public's interest in Astronomy and Planetary Sciences. Meteorite exhibitions reveal the fascination of students, educators and even researchers for these extraterrestrial rocks and how these can explain many key questions related to the origin and evolution of our Solar System. However, despite the efforts of private collectors, museums and other institutions to organize meteorite exhibitions, the reach of these is usually limited. The Virtual Museum for Meteorites takes advantage of HTML and related technologies to overcome local boundaries and offer its contents to a global audience. A description of the recent developments performed in the framework of this virtual museum is given in this work.
Innovation, and thus the production of knowledge, becomes a factor of competitiveness. In this context, quality management could be complemented by knowledge management aiming to improve the production of knowledge by the research activities process. To this end, after describing knowledge and information typologies in engineering activities, a knowledge management system is proposed. The goal is to support: (1) Semi-Structured Information (e.g., reports), thanks to the BASIC-Lab tool functions, which are based on attributing points of view and annotations to documents and document zones; and (2) Non-Structured Information (such as mail, dialogues, etc.), thanks to the MICA-Graph approach, which intends to support the exchange of technical messages concerning the common resolution of research problems within project teams and to capitalise relevant knowledge. For both approaches, prototype tools have been developed and evaluated, primarily to feed back manufacturing knowledge in the EADS industrial environment.
Theune, Mariet; van Schooten, B.W.; op den Akker, Hendrikus J.A.; Bosma, W.E.; Hofs, D.H.W.; Nijholt, Antinus; Krahmer, E.J.; van Hooijdonk, C.M.J.; Marsi, E.C.; Ruiz Miyarez, L.; Munoz Alvarado, A.; Alvarez Moreno, C.
We present the Dutch IMIX research programme on multimodal interaction, speech and language technology. We discuss our contributions to this programme in the form of two research projects, IMOGEN and VIDIAM, and the technical integration of the various modules developed by IMIX subprojects to build
Jijkoun, V.; de Rijke, M.; McDonald, S.; Tait, J.
Question answering systems aim to meet users' information needs by returning exact answers in response to a question. Traditional open domain question answering systems are built around a single pipeline architecture. In an attempt to exploit multiple resources as well as multiple answering
Ogao Patrick J
Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping – all critical aspects of a country's socio-economic development. In this paper, a user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Results Three types of animation were used, namely: passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performances for each session were assessed. The results show that interactivity in animation is a preferred
Bickerstaffe, Adrian; Ranaweera, Thilina; Endersby, Travis; Ellis, Christopher; Maddumarachchi, Sanjaya; Gooden, George E; White, Paul; Moses, Eric K; Hewitt, Alex W; Hopper, John L
The Ark is an open-source web-based tool that allows researchers to manage health and medical research data for humans and animals without specialized database skills or programming expertise. The system provides data management for core research information including demographic, phenotype, biospecimen and pedigree data, in addition to supporting typical investigator requirements such as tracking participant consent and correspondence, whilst also being able to generate custom data exports and reports. The Ark is 'study generic' by design and highly configurable via its web interface, allowing researchers to tailor the system to the specific data management requirements of their study. Source code for The Ark can be obtained freely from the website https://github.com/The-Ark-Informatics/ark/ . The source code can be modified and redistributed under the terms of the GNU GPL v3 license. Documentation and a pre-configured virtual appliance can be found at the website http://sphinx.org.au/the-ark/ . Supplementary data are available at Bioinformatics online.
Garaizar, Pablo; Reips, Ulf-Dietrich
Social networking has surpassed e-mail and instant messaging as the dominant form of online communication (Meeker, Devitt, & Wu, 2010). Currently, all large social networks are proprietary, making it difficult to impossible for researchers to make changes to such networks for the purpose of study design and access to user-generated data from the networks. To address this issue, the authors have developed and present Social Lab, an Internet-based free and open-source social network software system available from http://www.sociallab.es . Having full availability of navigation and communication data in Social Lab allows researchers to investigate behavior in social media on an individual and group level. Automated artificial users ("bots") are available to the researcher to simulate and stimulate social networking situations. These bots respond dynamically to situations as they unfold. The bots can easily be configured with scripts and can be used to experimentally manipulate social networking situations in Social Lab. Examples for setting up, configuring, and using Social Lab as a tool for research in social media are provided.
James S. Bates
Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information-gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesotho language, spoken in the African nation of Lesotho where the concept for this instrument was conceived. Ketso techniques fall into the participatory action research family of social science research methods (Tippett, Handley, & Ravetz, 2007). Ohio State University Extension professionals have used the Ketso toolkit and its techniques in numerous settings, including for professional development, conducting community needs/interests assessments, brainstorming, and data collection. As a toolkit, Ketso uses tactile and colorful leaves, branches, and icons to organize and display participants’ contributions on felt mats. As an array of techniques, Ketso is effective in engaging audiences because it is inclusive and provides each participant a platform for their perspective to be shared.
Methane hydrates create problems by blocking pipelines and casing; they are also accused of contributing to environmental problems (e.g. global warming). Methane hydrates are also found in permafrost areas and in oceanic sediments where the necessary temperature and pressure for stability occur. Claims for the widespread occurrence in thick oceanic deposits are unfounded: apparently indirect evidence from seismic reflectors, seismic hydrocarbon indicators, logs and free samples is unreliable. At one time, hydrate was seen as a static, biogenic, continuous, huge resource but that view is changing to one of a dynamic, overpressurised, discontinuous and unreliable resource. Only Japan and India are currently showing any serious interest in hydrates. Academic research has raised more questions than answers. It is suggested that more hard exploratory evidence rather than theoretical study is required
Veller, van M.G.P.; Gerritsma, W.
Wageningen UR Library has developed a tool, based upon co-citation analysis, that recommends alternative journals to researchers for a journal they look up in the tool. The journal recommendations can be tuned to include citation preferences for each of the five science groups that comprise
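The abstract gives no implementation details, but the general idea of a co-citation-based recommender can be sketched: count how often pairs of journals are cited together, then rank candidates by the similarity of their co-citation profiles. A minimal Python illustration follows; all journal names and counts are hypothetical, not taken from the Wageningen tool.

```python
from math import sqrt

# Hypothetical co-citation counts: cocite[a][b] = number of papers
# that cite both journal a and journal b (invented for illustration).
cocite = {
    "J. Ecol.": {"Ecology": 40, "Oikos": 25, "Nature": 5},
    "Ecology":  {"J. Ecol.": 40, "Oikos": 30, "Nature": 8},
    "Oikos":    {"J. Ecol.": 25, "Ecology": 30, "Nature": 2},
    "Nature":   {"J. Ecol.": 5, "Ecology": 8, "Oikos": 2},
}

def cosine(a, b):
    """Cosine similarity between two journals' co-citation profiles."""
    keys = list(set(cocite[a]) | set(cocite[b]))
    va = [cocite[a].get(k, 0) for k in keys]
    vb = [cocite[b].get(k, 0) for k in keys]
    dot = sum(x * y for x, y in zip(va, vb))
    na = sqrt(sum(x * x for x in va))
    nb = sqrt(sum(y * y for y in vb))
    return dot / (na * nb) if na and nb else 0.0

def recommend(journal, n=2):
    """Rank the other journals by profile similarity to `journal`."""
    others = [j for j in cocite if j != journal]
    return sorted(others, key=lambda j: cosine(journal, j), reverse=True)[:n]
```

Tuning per science group, as the abstract describes, would amount to weighting the co-citation counts by the citing group's preferences before computing the similarities.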
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Stevenson, Jennifer; Gaarder, Marie
A range of organizations are engaged in the production of evidence on the effects of health, social, and economic development programs on human welfare outcomes. However, evidence is often scattered around different databases, web sites, and the gray literature and is often presented in inaccessible formats. Lack of overview of the evidence in a specific field can be a barrier to the use of existing research and prevent efficient use of limited resources for new research. Evidence & Gap Maps (EGMs) aim to address these issues and complement existing synthesis and mapping approaches. EGMs are a new addition to the tools available to support evidence-informed policymaking. To provide an accessible resource for researchers, commissioners, and decision makers, EGMs provide thematic collections of evidence structured around a framework which schematically represents the types of interventions and outcomes of relevance to a particular sector. By mapping the existing evidence using this framework, EGMs provide a visual overview of what we know and do not know about the effects of different programs. They make existing evidence available, and by providing links to user-friendly summaries of relevant studies, EGMs can facilitate the use of existing evidence for decision making. They identify key "gaps" where little or no evidence from impact evaluations and systematic reviews is available and can be a valuable resource to inform a strategic approach to building the evidence base in a particular sector. The article will introduce readers to the concept and methods of EGMs and present a demonstration of the EGM tool using existing examples. Copyright © 2016 Elsevier Inc. All rights reserved.
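At its core, the framework described above is a matrix of interventions against outcomes, where each cell holds the studies found for that combination and empty cells mark the evidence "gaps". A minimal sketch of that structure (the study records below are invented for illustration, not from any real EGM):

```python
from collections import Counter

# Hypothetical study records as (intervention, outcome) pairs.
studies = [
    ("cash transfer", "school enrolment"),
    ("cash transfer", "child health"),
    ("deworming", "school enrolment"),
    ("cash transfer", "school enrolment"),
]

# An EGM cell is simply the count (or list) of studies for an
# intervention x outcome combination; zero-count cells are gaps.
grid = Counter(studies)

interventions = sorted({i for i, _ in studies})
outcomes = sorted({o for _, o in studies})
for i in interventions:
    row = {o: grid.get((i, o), 0) for o in outcomes}
    print(i, row)
```

A real EGM would attach study metadata and summary links to each cell rather than bare counts, but the gap-finding logic is the same: enumerate the full intervention-by-outcome grid and flag the empty cells.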
Smartt, H.; Kuhn, M.; Krementz, D.
The U.S. National Nuclear Security Administration (NNSA) Office of Non-proliferation and Verification Research and Development currently funds research on advanced containment technologies to support Continuity of Knowledge (CoK) objectives for verification regimes. One effort in this area is the Advanced Tools for Maintaining Continuity of Knowledge (ATCK) project. Recognizing that CoK assurances must withstand potential threats from sophisticated adversaries, and that containment options must therefore keep pace with technology advances, the NNSA research and development on advanced containment tools is an important investment. The two ATCK efforts underway at present address the technical containment requirements for securing access points (loop seals) and protecting defined volumes. Multiple U.S. national laboratories are supporting this project: Sandia National Laboratories (SNL), Savannah River National Laboratory (SRNL), and Oak Ridge National Laboratory (ORNL). SNL and SRNL are developing the "Ceramic Seal," an active loop seal that integrates multiple advanced security capabilities and improved efficiency housed within a small-volume ceramic body. The development includes an associated handheld reader and interface software. Currently at the prototype stage, the Ceramic Seal will undergo a series of tests to determine operational readiness. It will be field tested in a representative verification trial in 2016. ORNL is developing the Whole Volume Containment Seal (WCS), a flexible conductive fabric capable of enclosing various sizes and shapes of monitored items. The WCS includes a distributed impedance measurement system for imaging the fabric surface area and passive tamper-indicating features such as permanent-staining conductive ink. With the expected technology advances from the Ceramic Seal and WCS, the ATCK project takes significant steps in advancing containment technologies to help maintain CoK for various verification
Galuvao, Akata Sisigafu'aapulematumua
This article introduces Tofa'a'anolasi, a novel Samoan research framework created by drawing on the work of other Samoan and Pacific education researchers, in combination with adapting the 'Foucauldian tool box' to use for research carried out from a Samoan perspective. The article starts with an account and explanation of the process of…
Laursen, S. L.; Hunter, A.; Weston, T.; Thiry, H.
Evidence-based thinking is essential both to science and to the development of effective educational programs. Thus assessment of student learning—gathering evidence about the nature and depth of students’ learning gains, and about how they arise—is a centerpiece of any effective undergraduate research (UR) program. Assessment data can be used to monitor progress, to diagnose problems, to strengthen program designs, and to report both good outcomes and strategies to improve them to institutional and financial stakeholders in UR programs. While the positive impact of UR on students’ educational, personal and professional development has long been a matter of faith, only recently have researchers and evaluators developed an empirical basis by which to identify and explain these outcomes. Based on this growing body of evidence, URSSA, the Undergraduate Research Student Self-Assessment, is a survey tool that departments and programs can use to assess student outcomes of UR. URSSA focuses on what students learn from their UR experience, rather than whether they liked it. Both multiple-choice and open-ended items focus on students’ gains from UR, including: (1) skills such as lab work and communication; (2) conceptual knowledge and linkages among ideas in their field and with other fields; (3) deepened understanding of the intellectual and practical work of science; (4) growth in confidence and adoption of the identity of scientist; (5) preparation for a career or graduate school in science; and (6) greater clarity in understanding what career or educational path they might wish to pursue. Other items probe students’ participation in important activities that have been shown to lead to these gains; and a set of optional items can be included to probe specific program features that may supplement UR (e.g. field trips, career seminars, housing arrangements). The poster will describe URSSA's content, development, validation, and use. For more information about
De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava
The world has witnessed growing complexities in the disease scenario, influenced by drastic changes in the host-pathogen-environment triadic relation. Pharmaceutical R&Ds are in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalency studies. Apart from cost, time and ethical issues in animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards and biodiversity issues. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, further complicating the total scenario. Thus pharma R&Ds face a challenge to develop bio-waiver protocols. Lead optimization, drug candidate selection with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools like Gastro Plus™, PK Sim® and SimCyp find applications for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development tool box. PBPK provides the state of the art to serve as an alternative to animal experimentation. PBPK models can successfully bypass bio-equivalency studies and predict bioavailability and drug interactions, and when coupled with in vitro-in vivo correlation they can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly bio-waiver predictive tool in drug development. Copyright© Bentham Science Publishers.
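The PBPK simulators named above are commercial and complex, but their simplest building block, a one-compartment model with first-order oral absorption and elimination (the Bateman equation), can be sketched directly. All parameter values below are illustrative, not drug-specific.

```python
from math import exp

def concentration(t, dose, F, ka, ke, V):
    """Plasma concentration at time t (h) for a one-compartment model
    with first-order absorption and elimination (Bateman equation).
    dose in mg, V in L, ka/ke in 1/h, F = bioavailable fraction.
    Assumes ka != ke."""
    return (F * dose * ka) / (V * (ka - ke)) * (exp(-ke * t) - exp(-ka * t))

# Illustrative parameters (hypothetical, not a real drug):
params = dict(dose=100.0, F=0.9, ka=1.2, ke=0.15, V=40.0)
curve = [concentration(t, **params) for t in range(0, 25)]
tmax = curve.index(max(curve))  # hour of peak concentration on this grid
```

Full PBPK models replace this single well-stirred compartment with a network of organ compartments linked by blood flows, but each compartment's mass balance is built from the same first-order transfer terms shown here.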
Chen, Chunpeng James; Zhang, Zhiwu
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which are the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and the zero tolerance for errors in data formats, keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third-party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat.
Morguí, Josep-Anton; Font, Anna; Cañas, Lidia; Vázquez-García, Eusebi; Gini, Andrea; Corominas, Ariadna; Àgueda, Alba; Lobo, Agustin; Ferraz, Carlos; Nofuentes, Manel; Ulldemolins, Delmir; Roca, Alex; Kamnang, Armand; Grossi, Claudia; Curcoll, Roger; Batet, Oscar; Borràs, Silvia; Occhipinti, Paola; Rodó, Xavier
An educational tool was designed with the aim of making the research done on Greenhouse Gases (GHGs) in the ClimaDat Spanish network of atmospheric observation stations (www.climadat.es) more comprehensible. This tool is called Air Enquirer and consists of a multi-sensor box. More than two hundred boxes are envisaged to be built and supplied to Spanish high schools through the education department (www.educaixa.com) of the "Obra Social 'La Caixa'", which funds this research. The starting point for the development of the Air Enquirers was the experience at IC3 (www.ic3.cat) in the CarboSchools+ FP7 project (www.carboschools.cat, www.carboschools.eu). The Air Enquirer's multi-sensor box is based on the Arduino architecture and contains sensors for CO2, temperature, relative humidity, pressure, and both infrared and visible luminance. The Air Enquirer is designed to take continuous measurements. Every Air Enquirer ensemble of measurements is used to convert values to standard units (water content in ppmv, and CO2 in ppmv_dry). These values are referred to a calibration made with cavity ring-down spectrometry (Picarro®) under different temperature, pressure, humidity and CO2 concentrations. Multiple sets of Air Enquirers are intercalibrated for use in parallel during the experiments. The different experiments proposed to the students will be outdoor (observational) or indoor (experimental, in the lab), focusing on understanding the biogeochemistry of GHGs in ecosystems (mainly CO2), the exchange (flux) of gases, organic matter production, respiration and decomposition processes, the influence of anthropogenic activities on gas (and particle) exchanges, and their interaction with the structure and composition of the atmosphere (temperature, water content, cooling and warming processes, radiative forcing, vertical gradients and horizontal patterns). In order to ensure the Air Enquirers a high-profile research performance the experimental designs
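The conversion to ppmv_dry mentioned above is, in its simplest form, a dilution correction that removes the water vapour contribution from the measured mole fraction. The sketch below shows that standard correction, not the Air Enquirer's actual calibration code:

```python
def co2_dry_ppm(co2_wet_ppm, h2o_ppm):
    """Convert a CO2 mole fraction measured in moist air (ppmv) to a
    dry-air mole fraction (ppmv_dry), given the water vapour content
    in ppmv. Standard dilution correction only; instrument-specific
    calibration terms (as in the Picarro cross-calibration) are omitted."""
    h2o_fraction = h2o_ppm / 1e6          # ppmv -> mole fraction
    return co2_wet_ppm / (1.0 - h2o_fraction)

# e.g. 400 ppm measured with 2% water vapour (20000 ppmv)
corrected = co2_dry_ppm(400.0, 20000.0)   # ~408.2 ppmv_dry
```

Real analyzers additionally correct for spectroscopic line broadening by water vapour, which is why the abstract's reference calibration spans a range of temperature, pressure and humidity conditions.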
Blackledge, Matthew D; Collins, David J; Koh, Dow-Mu; Leach, Martin O
We present pyOsiriX, a plugin built for the already popular DICOM viewer OsiriX that provides users the ability to extend the functionality of OsiriX through simple Python scripts. This approach allows users to integrate the many cutting-edge scientific/image-processing libraries created for Python into a powerful DICOM visualisation package that is intuitive to use and already familiar to many clinical researchers. Using pyOsiriX we hope to bridge the apparent gap between basic imaging scientists and clinical practice in a research setting and thus accelerate the development of advanced clinical image processing. We provide arguments for the use of Python as a robust scripting language for incorporation into larger software solutions, outline the structure of pyOsiriX and how it may be used to extend the functionality of OsiriX, and we provide three case studies that exemplify its utility. For our first case study we use pyOsiriX to provide a tool for smooth histogram display of voxel values within a user-defined region of interest (ROI) in OsiriX. We used a kernel density estimation (KDE) method available in Python using the scikit-learn library, where the total number of lines of Python code required to generate this tool was 22. Our second example presents a scheme for segmentation of the skeleton from CT datasets. We have demonstrated that good segmentation can be achieved for two example CT studies by using a combination of Python libraries including scikit-learn, scikit-image, SimpleITK and matplotlib. Furthermore, this segmentation method was incorporated into an automatic analysis of quantitative PET-CT in a patient with bone metastases from primary prostate cancer. This enabled repeatable statistical evaluation of PET uptake values for each lesion, before and after treatment, providing estimates of maximum and median standardised uptake values (SUVmax and SUVmed respectively). Following treatment we observed a reduction in lesion volume, SUVmax and SUVmed for
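The first case study uses scikit-learn's KDE inside OsiriX; the underlying idea, a Gaussian kernel density estimate over ROI voxel values, can be shown in a dependency-free sketch (the voxel values below are invented, and this is not the plugin's actual code):

```python
from math import exp, pi, sqrt

def gaussian_kde(samples, bandwidth):
    """Return a smooth density estimate f(x) over the samples, i.e. the
    'smooth histogram' idea the pyOsiriX case study implements with
    scikit-learn; here as a minimal pure-Python version."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * sqrt(2 * pi))
    def f(x):
        # Sum of Gaussian bumps centred on each sample value.
        return norm * sum(exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return f

# Hypothetical ROI voxel values (e.g. CT numbers in HU):
voxels = [30, 35, 33, 60, 62, 58, 31, 59]
density = gaussian_kde(voxels, bandwidth=3.0)
curve = [(x, density(x)) for x in range(20, 75, 5)]  # smooth "histogram"
```

In the plugin itself the same estimate would come from `sklearn.neighbors.KernelDensity`, with the curve handed to OsiriX for display over the ROI.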
Grigoriev, S. N.; Bobrovskij, N. M.; Melnikov, P. A.; Bobrovskij, I. N.
The modern vector of development of machining technologies is aimed at the transition to environmentally safe, “green” technologies. The concept of “green technology” includes a set of attributes of knowledge intended for practical use (“technology”). One way to improve the quality of production is to use surface plastic deformation (SPD) processing methods. The advantage of SPD is its capability to combine the effects of finishing and strengthening treatment; SPD processing can replace fine turning, grinding or polishing operations. SPD is a forceful contact action of an indenter on the workpiece surface under conditions of relative motion. It is difficult to implement the core SPD technologies (burnishing, roller burnishing, etc.) while maintaining their core technological advantages without the use of lubricating and cooling technology (metalworking fluids, MWF). The “green” SPD technology developed by the authors for dry processing does not have such shortcomings. When processing with SPD without the use of MWF, the requirements for tool durability are most significant, especially in the conditions of mass production. It is important to determine the durability period of the tool at the design stage of the technological process in order to prevent wastage. This paper presents the results of durability research on natural and synthetic diamonds (polycrystalline diamond, ASPK) as well as the precision of polycrystalline superabrasive tools made of dense boron nitride (DBN) during SPD processing without the application of MWF.
Satpathy, R; Konkimalla, V B; Ratha, J
Microbial dehalogenation is a biochemical process in which halogenated substances are catalyzed enzymatically into their non-halogenated form. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Since most of these halogenated organic compounds are pollutants that need to be remediated, current approaches explore the potential of microbes at the molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Therefore, once a microorganism is discovered, one can predict genes/proteins and perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights various bioinformatics approaches and describes the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modeling methods comprising gene finding, protein modelling, Quantitative Structure Biodegradability Relationship (QSBR) studies and reconstruction of metabolic pathways employed in the dehalogenation research area.
Saadet Kuru Cetin
In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools, using alternative observation tools within the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years, and the class observations were made with six volunteer teachers in primary and secondary schools in the provincial and district centers. In the classroom observations, the teacher's verbal flow scheme, the teacher's movement scheme, and student behaviors both on- and off-task were analyzed. Observations were made during two classes with the teacher's permission. After the first observation, an information meeting was held, and then the second observation was made. Following the observations, interviews were held with the teachers, in which the information from the class observations was shared with them and their opinions about the research were sought. It was found that alternative observations, in general, have a positive effect on the professional development of teachers. It is concluded that this type of observation approach positively affects teachers' in-class activities, helps with classroom management and teaching arrangements, and reduces unwanted student behaviors.
Schockaert, Steven; Janssen, Jeroen; Vermeir, Dirk; de Cock, Martine
Since its introduction, answer set programming has been generalized in many directions, to cater to the needs of real-world applications. As one of the most general “classical” approaches, answer sets of arbitrary propositional theories can be defined as models in the equilibrium logic of Pearce. Fuzzy answer set programming, on the other hand, extends answer set programming with the capability of modeling continuous systems. In this paper, we combine the expressiveness of both approaches, and define answer sets of arbitrary fuzzy propositional theories as models in a fuzzification of equilibrium logic. We show that the resulting notion of answer set is compatible with existing definitions, when the syntactic restrictions of the corresponding approaches are met. We furthermore locate the complexity of the main reasoning tasks at the second level of the polynomial hierarchy. Finally, as an illustration of its modeling power, we show how fuzzy equilibrium logic can be used to find strong Nash equilibria.
Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T
The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with three simple tools: paper, pen, and a calculator. A computer program can quickly calculate the t-test for large samples. The ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
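The "paper, pen, and calculator" computation described above is the pooled-variance two-sample t statistic, which can be written out in a few lines (the sample data are invented for illustration):

```python
from math import sqrt

def two_sample_t(x, y):
    """Student's two-sample t statistic with pooled variance:
    t = (m1 - m2) / sqrt(sp^2 * (1/n1 + 1/n2)), where sp^2 pools
    the two sums of squares over n1 + n2 - 2 degrees of freedom."""
    n1, n2 = len(x), len(y)
    m1 = sum(x) / n1
    m2 = sum(y) / n2
    ss1 = sum((v - m1) ** 2 for v in x)
    ss2 = sum((v - m2) ** 2 for v in y)
    sp2 = (ss1 + ss2) / (n1 + n2 - 2)   # pooled variance estimate
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

t = two_sample_t([5.1, 4.9, 5.4, 5.0], [4.2, 4.5, 4.1, 4.4])
```

The article's caution applies here too: this statistic assumes roughly equal variances and independent samples; the one-sample and paired variants it discusses use different formulas, and the resulting t must still be compared against the t distribution with the appropriate degrees of freedom before drawing inferences.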
Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio
Here we present an overview of some published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to investigating such problems with functional Magnetic Resonance Imaging (fMRI), a methodology also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while people are watching marketing-relevant stimuli. Such information relates to the memorization of, and pleasantness evoked by, such stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how an EEG methodology can be used to analyze cultural differences in the reception of video commercials for carbonated beverages in Western and Eastern countries.
Here we provide an update on the construction of the five NEON Mobile Deployment Platforms (MDPs), as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e. timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g. fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measurements of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.
Supreet Kaur Gill
Clinical research makes tireless efforts to promote the health and wellbeing of the population. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process in clinical research faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software like electronic data capture, remote data capture and the electronic case report form (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design, through drug development and clinical trials, and during pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.
Goldfarb, L.; Yang, A.
Leah Goldfarb, Paul Cutler, Andrew Yang*, Mustapha Mokrane, Jacinta Legg and Deliang Chen. The scientific community has been engaged in developing an international strategy on Earth system research. The initial consultation in this “visioning” process focused on gathering suggestions for Earth system research priorities that are interdisciplinary and address the most pressing societal issues. This was implemented through a website that utilized Web 2.0 capabilities. The website (http://www.icsu-visioning.org/) collected input from 15 July to 1 September 2009. This consultation was the first in which the international scientific community was asked to help shape the future of a research theme. The site attracted over 7000 visitors from 133 countries, more than 1000 of whom registered and took advantage of the site’s functionality to contribute research questions (~300 questions), comment on posts, and/or vote on questions. To facilitate analysis of the results, the site captured a small set of voluntary information about each contributor and their contribution. A group of ~50 international experts was invited to analyze the inputs at a “Visioning Earth System Research” meeting held in September 2009. The outcome of this meeting—a prioritized list of research questions to be investigated over the next decade—was then posted on the visioning website for additional comment from the community through an online survey tool. In general, many lessons were learned in the development and implementation of this website, both in terms of the opportunities offered by Web 2.0 capabilities and the application of these capabilities. It is hoped that this process may serve as a model for other scientific communities. The International Council for Science (ICSU), in cooperation with the International Social Science Council (ISSC), is responsible for organizing this Earth system visioning process.
Binello, E.; Mitchell, R.N.; Harling, O.K.
An immunologic tool based on manipulation of the boron neutron capture reaction was previously proposed in the context of heart transplantation research to examine the temporal relationship between parenchymal rejection (representing immune cell infiltration) and transplantation-associated arteriosclerosis (characterized by progressive vascular occlusion). Critical to the development of this method is the uptake of boron by specific cells of the immune system, namely T cells, without adverse effects on cell function, which may be assessed by the ability of boron-loaded cells to produce IFNγ, a protein with substantial impact on rejection. This work presents the evaluation of two carboranyl thymidine analogs. Advantages of this type of boron compound are reduced risk of leakage and effective dose delivery based on their incorporation into cellular nuclear material. Results indicate that uptake of these boronated nucleosides is high with no adverse effects on cell function, thereby warranting the continued development of this technique that has potentially wide applicability in immunological models.
The report includes the following chapters: (1) Introduction: ozone in the atmosphere, anthropogenic influence on the ozone layer, polar stratospheric ozone loss; (2) Tracer-tracer relations in the stratosphere: tracer-tracer relations as a tool in atmospheric research; impact of cosmic-ray-induced heterogeneous chemistry on polar ozone; (3) Quantifying polar ozone loss from ozone-tracer relations: principles of tracer-tracer correlation techniques; reference ozone-tracer relations in the early polar vortex; impact of mixing on ozone-tracer relations in the polar vortex; impact of mesospheric intrusions on ozone-tracer relations in the stratospheric polar vortex; calculation of chemical ozone loss in the Arctic in March 2003 based on ILAS-II measurements; (4) Epilogue.
Nelson, Douglas G; Byus, Kent
Contemporary public health requires the support and participation of its constituency. This study assesses the capacity of consumption value theory to identify the basis of this support. A telephone survey design used simple random sampling of adult residents of Cherokee County, Oklahoma. Factor analysis and stepwise discriminant analysis were used to identify and classify personal and societal level support variables. Most residents base societal level support on epistemic values. Direct services clientele base their support on positive emotional values derived from personal contact and attractive programs. Residents are curious about public health and want to know more about the health department. Whereas marketing the effectiveness of public health programs would yield relatively little support, marketing health promotion activities may attract public opposition. This formative research tool suggests a marketing strategy for public health practitioners.
Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
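The hidden Markov modeling that SMART applies to single-molecule traces can be illustrated with a minimal sketch: Viterbi decoding of a discretized two-state trace. Everything here (the "low"/"high" state names, the transition and emission probabilities, the toy trace) is invented for illustration and is not SMART's actual model or code.

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely hidden state path for a discretized signal (Viterbi)."""
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor maximizes previous score plus transition.
            best_prev = max(states, key=lambda p: V[-2][p] + log_trans[p][s])
            V[-1][s] = V[-2][best_prev] + log_trans[best_prev][s] + log_emit[s][o]
            new_path[s] = path[best_prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Hypothetical two-state model: "low" and "high" signal levels,
# observations discretized to 0 (dim) or 1 (bright).
states = ("low", "high")
lg = math.log
log_start = {"low": lg(0.5), "high": lg(0.5)}
log_trans = {"low": {"low": lg(0.8), "high": lg(0.2)},
             "high": {"low": lg(0.2), "high": lg(0.8)}}
log_emit = {"low": {0: lg(0.8), 1: lg(0.2)},
            "high": {0: lg(0.2), 1: lg(0.8)}}

trace = [0, 0, 1, 1, 1, 0, 0]
states_path = viterbi(trace, states, log_start, log_trans, log_emit)
```

With these sticky transitions the decoder attributes the run of bright observations to a dwell in the "high" state rather than to emission noise.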
Human pluripotent stem cells (hPSCs), namely embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs), with their ability of indefinite self-renewal and capability to differentiate into cell types derived from all three germ layers, represent a powerful research tool in developmental biology, for drug screening, disease modelling, and potentially cell replacement therapy. Efficient differentiation protocols that would result in the cell type of interest are needed for maximal exploitation of these cells. In the present work, we focus on protocols for differentiation of hPSCs into functional cardiomyocytes in vitro, as well as on achievements in heart disease modelling and drug testing on patient-specific iPSC-derived cardiomyocytes (iPSC-CMs).
Verma, A.K.; Varde, P.V.; Sankar, S.; Prakash, P.
A prototype Knowledge Based (KB) Operator Adviser (OPAD) system has been developed for a 100 MW(th) heavy-water-moderated and -cooled, natural-uranium-fuelled research reactor. The development objective of this system is to improve the reliability of operator action, and hence reactor safety, during crises as well as normal operation. The jobs performed by this system include alarm analysis, transient identification, reactor safety status monitoring, qualitative fault diagnosis and procedure generation in reactor operation. In order to address safety objectives at various stages of the OPAD system development, the knowledge has been structured using PSA tools and information in a shell environment. To demonstrate the feasibility of combining the KB approach with PSA for an operator adviser system, salient features of some of the important modules (viz. FUELEX, LOOPEX and LOCAEX) are discussed. It has been found that this system can serve as an efficient operator support system.
Gholami, Jaleh; Majdzadeh, Reza; Nedjat, Saharnaz; Nedjat, Sima; Maleki, Katayoun; Ashoorkhani, Mahnaz; Yazdizadeh, Bahareh
The knowledge translation self-assessment tool for research institutes (SATORI) was designed to assess the status of knowledge translation in research institutes. The objective was to identify the weaknesses and strengths of knowledge translation in research centres and faculties associated with Tehran University of Medical Sciences (TUMS). The tool, consisting of 50 statements in four main domains, was used in 20 TUMS-affiliated research centres and departments after its reliability was established. It was completed in a group discussion by the members of the research council, researchers and research users' representatives from each centre and/or department. The mean scores obtained in the four domains of 'The question of research', 'Knowledge production', 'Knowledge transfer' and 'Promoting the use of evidence' were 2.26, 2.92, 2 and 1.89 (out of 5), respectively. Nine out of 12 interventional priorities with the lowest quartile score were related to knowledge transfer resources and strategies, whereas eight of them were in the highest quartile and related to 'The question of research' and 'Knowledge production'. The self-assessment tool identifies the gaps in capacity and infrastructure of knowledge translation support within research organizations. Assessment of research institutes using SATORI pointed out that strengthening knowledge translation through provision of financial support for knowledge translation activities, creating supportive and facilitating infrastructures, and facilitating interactions between researchers and target audiences to exchange questions and research findings are among the priorities of research centres and/or departments.
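The scoring logic described in the SATORI abstract (domain means on a 1-5 scale, with intervention priorities drawn from the lowest-quartile item scores) can be sketched as follows. The item scores below are invented for illustration; they are not SATORI data, and the real tool has 50 statements rather than four per domain.

```python
from statistics import mean, quantiles

# Hypothetical item scores (1-5) grouped by SATORI domain -- illustrative only.
scores = {
    "The question of research": [2, 3, 2, 2],
    "Knowledge production": [3, 3, 2, 4],
    "Knowledge transfer": [2, 1, 3, 2],
    "Promoting the use of evidence": [1, 2, 2, 3],
}

# Mean score per domain, rounded to two decimals as in the abstract.
domain_means = {d: round(mean(v), 2) for d, v in scores.items()}

# Items scoring in the lowest quartile of all item scores are flagged
# as intervention priorities (domain name, item index).
all_scores = [s for v in scores.values() for s in v]
q1 = quantiles(all_scores, n=4)[0]
priorities = [(d, i) for d, v in scores.items()
              for i, s in enumerate(v) if s <= q1]
```

The lowest-quartile cut-off mirrors how the abstract ties interventional priorities to the lowest quartile of scores.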
Akl, Elie A; Fadlallah, Racha; Ghandour, Lilian; Kdouh, Ola; Langlois, Etienne; Lavis, John N; Schünemann, Holger; El-Jardali, Fadi
Groups or institutions funding or conducting systematic reviews in health policy and systems research (HPSR) should prioritise topics according to the needs of policymakers and stakeholders. The aim of this study was to develop and validate a tool to prioritise questions for systematic reviews in HPSR. We developed the tool following a four-step approach consisting of (1) definition of the purpose and scope of the tool, (2) item generation and reduction, (3) testing for content and face validity, and (4) pilot testing of the tool. The research team involved international experts in HPSR, systematic review methodology and tool development, led by the Center for Systematic Reviews on Health Policy and Systems Research (SPARK). We followed an inclusive approach in determining the final selection of items to allow customisation to the user's needs. The purpose of the SPARK tool was to prioritise questions in HPSR in order to address them in systematic reviews. In the item generation and reduction phase, an extensive literature search yielded 40 relevant articles, which were reviewed by the research team to create a preliminary list of 19 candidate items for inclusion in the tool. As part of testing for content and face validity, input from international experts led to the refining, changing, merging and addition of new items, and to organisation of the tool into two modules. Following pilot testing, we finalised the tool, with 22 items organised in two modules - the first module including 13 items to be rated by policymakers and stakeholders, and the second including 9 items to be rated by systematic review teams. Users can customise the tool to their needs, by omitting items that may not be applicable to their settings. We also developed a user manual that provides guidance on how to use the SPARK tool, along with signaling questions. We have developed and conducted initial validation of the SPARK tool to prioritise questions for systematic reviews in HPSR, along with
Torres, Samantha; de la Riva, Erika E; Tom, Laura S; Clayman, Marla L; Taylor, Chirisse; Dong, Xinqi; Simon, Melissa A
Despite increasing need to boost the recruitment of underrepresented populations into cancer trials and biobanking research, few tools exist for facilitating dialogue between researchers and potential research participants during the recruitment process. In this paper, we describe the initial processes of a user-centered design cycle to develop a standardized research communication tool prototype for enhancing research literacy among individuals from underrepresented populations considering enrollment in cancer research and biobanking studies. We present qualitative feedback and recommendations on the prototype's design and content from potential end users: five clinical trial recruiters and ten potential research participants recruited from an academic medical center. Participants were given the prototype (a set of laminated cards) and were asked to provide feedback about the tool's content, design elements, and word choices during semi-structured, in-person interviews. Results suggest that the prototype was well received by recruiters and patients alike. They favored the simplicity, lay language, and layout of the cards. They also noted areas for improvement, leading to card refinements that included the following: addressing additional topic areas, clarifying research processes, increasing the number of diverse images, and using alternative word choices. Our process for refining user interfaces and iterating content in early phases of design may inform future efforts to develop tools for use in clinical research or biobanking studies to increase research literacy.
Riggs, E. M.
beginners. Thus researchers must embrace the uncontrolled nature of the setting, the qualitative nature of the data collected, and the researcher's role in interpreting geologically appropriate actions as evidence of successful problem solving and investigation. Working to understand the role of diversity and culture in the geosciences also involves a wide array of theory, from affective issues through culturally and linguistically-influenced cognition, through gender, self-efficacy, and many other areas of inquiry. Research in understanding spatial skills draws heavily on techniques from cognition research but also must involve the field-specific knowledge of geoscientists to infuse these techniques with exemplars, a catalog of meaningful actions by students, and an understanding of how to recognize success. These examples illustrate briefly the wide array of tools from other fields that is being brought to bear to advance rigorous geoscience education research. We will illustrate a few of these and the insights we have gained, and the power of theory and method from other fields to enlighten us as we attempt to educate a broader array of earth scientists.
Fumagalli, E.; Verdelli, G.
ISMES (Experimental Institute for Models and Structures) is carrying out a series of tests on physical models as part of a research programme sponsored by DSR (Studies and Research Direction) of ENEL (Italian State Electricity Board), on behalf of CPN (Nuclear Design and Construction Centre) of ENEL, with the aim of gaining experience with a thin-walled PCPV for a BWR. The physical model, together with the mathematical model and the rheological model of the materials, is intended as a meaningful design tool. The mathematical model covers the overall structural design phase (geometries) and the linear behaviour, whereas the physical model, besides providing global information to be compared with the results of the mathematical model, supplies data on the non-linear behaviour up to failure and on local conditions (penetration area, etc.). The aim of the first phase of this research programme is to compare calculations and experimental tests with respect to the thicknesses of the wall and the bottom slab, whereas the second phase deals with the behaviour of the removable lid and its connection with the main structure. To this end, a model at scale 1:10 has been designed which symmetrically reproduces, with respect to the equator, the bottom part of the structure. In the bottom slab the penetrations of the prototype design are reproduced, whereas the upper slab is plain. This paper describes the model and illustrates the main results, underlining the different behaviour of the upper and bottom slabs up to collapse.
Digital tool making offers many challenges, involving much trial and error. Developing machine learning and assistance in automated and semi-automated Internet resource discovery, metadata generation, and rich-text identification provides opportunities for great discovery, innovation, and the potential for transformation of the library community. The areas of computer science involved, as applied to the library applications addressed, are among that discipline’s leading edges. Making applied research practical and applicable, through placement within library/collection-management systems and services, involves equal parts computer scientist, research librarian, and legacy-systems archaeologist. Still, the early harvest is there for us now, with a large harvest pending. Data Fountains and iVia, the projects discussed, demonstrate this. Clearly, then, the present would be a good time for the library community to more proactively and significantly engage with this technology and research, to better plan for its impacts, to more proactively take up the challenges involved in its exploration, and to better and more comprehensively guide effort in this new territory. The alternative to doing this is that others will develop this territory for us, do it not as well, and sell it back to us at a premium. Awareness of this technology and its current capabilities, promises, limitations, and probable major impacts needs to be generalized throughout the library management, metadata, and systems communities. This article charts recent work, promising avenues for new research and development, and issues the library community needs to understand.
Holmes, Bruce J.; Sawhill, Bruce K.; Herriot, James; Seehart, Ken; Zellweger, Dres; Shay, Rick
The objective of this research by NextGen AeroSciences, LLC is twofold: 1) to deliver an initial "toolbox" of algorithms, agent-based structures, and method descriptions for introducing trajectory agency as a methodology for simulating and analyzing airspace states, including bulk properties of large numbers of heterogeneous 4D aircraft trajectories in a test airspace -- while maintaining or increasing system safety; and 2) to use these tools in a test airspace to identify possible phase transition structure to predict when an airspace will approach the limits of its capacity. These 4D trajectories continuously replan their paths in the presence of noise and uncertainty while optimizing performance measures and performing conflict detection and resolution. In this approach, trajectories are represented as extended objects endowed with pseudopotential, maintaining time and fuel-efficient paths by bending just enough to accommodate separation while remaining inside of performance envelopes. This trajectory-centric approach differs from previous aircraft-centric distributed approaches to deconfliction. The results of this project are the following: 1) we delivered a toolbox of algorithms, agent-based structures and method descriptions as pseudocode; and 2) we corroborated the existence of phase transition structure in simulation with the addition of "early warning" detected prior to "full" airspace. This research suggests that airspace "fullness" can be anticipated and remedied before the airspace becomes unsafe.
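The conflict-detection half of the trajectory toolbox described above can be illustrated with a minimal sketch: each trajectory is a list of time-stamped 2D positions, and two aircraft are "in conflict" when they come closer than a separation minimum at the same sample time. This is only the detection step under invented units and data; the NextGen work uses continuous 4D replanning with pseudopotentials, which is not reproduced here.

```python
import math

def conflicts(trajs, sep=5.0):
    """Return (i, j, t) for each pair of trajectories i < j whose
    positions at a shared sample time t are closer than `sep`."""
    found = []
    for i in range(len(trajs)):
        for j in range(i + 1, len(trajs)):
            for (t1, x1, y1), (t2, x2, y2) in zip(trajs[i], trajs[j]):
                if t1 == t2 and math.hypot(x1 - x2, y1 - y2) < sep:
                    found.append((i, j, t1))
    return found

# Two toy trajectories sampled at t = 0, 1, 2 (positions are invented).
a = [(0, 0.0, 0.0), (1, 10.0, 0.0), (2, 20.0, 0.0)]
b = [(0, 0.0, 20.0), (1, 10.0, 4.0), (2, 20.0, 20.0)]
cs = conflicts([a, b])
```

In a resolution step, each detected conflict would trigger a local path bend just large enough to restore separation, which is the role the pseudopotential plays in the trajectory-centric approach.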
Wilson, G.E.; Boyack, B.E.
Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab
The purpose of this paper is to showcase the information literacy course for doctoral students called Information Resources and Tools for Research. Turku University Library organises this course in collaboration with the University of Turku Graduate School. The course, which was started in 2012, has been organised four times so far, twice in English and twice in Finnish. The course offers training to doctoral programmes in all seven disciplines present at the University of Turku and to the University's doctoral candidates. In our presentation we describe the structure and contents of the course and share our experiences of the collaboration with the University of Turku Graduate School. In addition, we describe how the information specialists of the Turku University Library have collaborated during the course. We also discuss the challenges of the course. Based on the course feedback, it can be stated that, in general, participants have found this course very useful for their research at the University of Turku.
Rutgers Cooperative Extension developed an online self-assessment tool called the Personal Health and Finance Quiz, available at http://njaes.rutgers.edu/money/health-finance-quiz/. Believed to be among the first public surveys to simultaneously query users about their health and personal finance practices, the quiz is part of Small Steps to Health and Wealth™ (SSHW), a Cooperative Extension program developed to motivate Americans to take action to improve both their health and personal finances (see http://njaes.rutgers.edu/sshw/). Respondents indicate one of four frequencies for performance of 20 daily activities and receive a Health, Finance, and Total score indicating their frequency of performing activities that health and financial experts recommend. In addition to providing users with personalized feedback, the quiz collects data for research about the health and financial practices of Americans to inform future Extension outreach, and can be used as a pre-/post-test to evaluate the impact of SSHW programs. Initial research analyses are planned for 2015.
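The scoring scheme described for the quiz (four frequency choices per daily activity, rolled up into Health, Finance, and Total scores) can be sketched as follows. The frequency labels, point weights, and sample answers are invented for illustration; the actual quiz items and weighting are not reproduced here.

```python
# Hypothetical point values for the four answer frequencies.
POINTS = {"never": 0, "sometimes": 1, "usually": 2, "always": 3}

def quiz_scores(health_answers, finance_answers):
    """Return (health, finance, total) scores from frequency answers."""
    health = sum(POINTS[a] for a in health_answers)
    finance = sum(POINTS[a] for a in finance_answers)
    return health, finance, health + finance

# Four sample answers per section (the real quiz has 20 items in all).
h = ["always", "usually", "sometimes", "never"]
f = ["always", "always", "never", "usually"]
health, finance, total = quiz_scores(h, f)
```

Splitting the total into a Health and a Finance subscore is what lets the tool give the paired feedback the SSHW program is built around.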
Roysri, Krisana; Chotipanich, Chanisa; Laopaiboon, Vallop; Khiewyoo, Jiraporn
Diagnostic nuclear medicine is being increasingly employed in clinical practice with the advent of new technologies and radiopharmaceuticals. Reporting the prevalence of the studied disease is important for assessing the quality of an article. Therefore, this study was performed to evaluate the quality of published nuclear medicine articles and determine the frequency of reporting the prevalence of studied diseases. We used the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) checklists to evaluate the quality of articles published in the five nuclear medicine journals with the highest impact factors in 2012. The articles were retrieved from the Scopus database and were selected and assessed independently by two nuclear medicine physicians. Decisions concerning equivocal data were made by consensus between the reviewers. The average STARD score was approximately 17 points, and the highest score was 17.19±2.38, obtained by the European Journal of Nuclear Medicine. The QUADAS-2 tool showed that all journals had low bias regarding the study population. The Journal of Nuclear Medicine had the highest score in terms of index test, reference standard, and time interval. Lack of clarity regarding the index test, reference standard, and time interval was frequently observed in all journals, including Clinical Nuclear Medicine, in which 64% of the studies were unclear regarding the index test. The Journal of Nuclear Cardiology had the highest number of articles with an appropriate reference standard (83.3%), though it had the lowest frequency of reporting disease prevalence (zero reports). All five journals had the same STARD score, while the index test, reference standard, and time interval were very unclear according to the QUADAS-2 tool. Unfortunately, data were too limited to determine which journal had the lowest risk of bias. In fact, it is the author's responsibility to provide details of research methodology so that the
van Vught, Frans; Westerheijden, Don F.
This paper sets out to analyse the need for better "transparency tools" which inform university stakeholders about the quality of universities. First, we give an overview of what we understand by the concept of transparency tools and those that are currently available. We then critique current transparency tools' methodologies, looking in detail…
Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom
This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…
How old is zero? That question has opened up a row between an international group of researchers and the University of Oxford after the Bodleian Library in Oxford noted that an ancient Indian text, known as the Bakhshali manuscript, had been dated to between 300 and 900 CE.
Western Interstate Commission for Higher Education, Boulder, CO. National Center for Higher Education Management Systems.
With some justification, the inability to answer most of the important questions in higher education is attributed to the lack of necessary information. But careful examination of our many-faceted questions suggests that more information may not be the only answer. The National Center for Higher Education Management Systems (NCHEMS) has found other…
Sotaridona, Leonardo; Meijer, R.R.
Two new indices to detect answer copying on a multiple-choice test—S1 and S2—were proposed. The S1 index is similar to the K index (Holland, 1996) and the K2 index (Sotaridona & Meijer, 2002) but the distribution of the number of matching incorrect answers of the source and the copier is modeled by
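The core quantity behind such copying indices, the number of items on which the source and the copier give the same incorrect answer, can be sketched together with a simple binomial tail probability. This is only an illustration of the idea: the actual S1, S2, and K-family indices model the matching probability from the response data rather than fixing it, and the answer strings below are invented.

```python
from math import comb

def matching_incorrect(source, copier, key):
    """Count items where source and copier give the same wrong answer."""
    return sum(1 for s, c, k in zip(source, copier, key)
               if s == c and s != k)

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more matches."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

key    = "ABCDA"   # answer key for a 5-item test (invented)
source = "ABDDC"   # wrong on items 3 and 5
copier = "ABDDB"   # shares the wrong 'D' on item 3
m = matching_incorrect(source, copier, key)
# Simplistic null model: any item yields a matching wrong answer with p=0.1.
p_value = binom_tail(len(key), m, 0.1)
```

A surprisingly small tail probability would flag the pair for further review; the published indices refine exactly this step by estimating the per-item matching probability instead of assuming a constant.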
... Initial Pleadings § 3030.14 Answer contents. (a) An answer must: (1) Contain a clear and concise statement... complainant and the Commission fully and completely of the nature of any defense, including factual... as such and presented separately from any denials; (5) State the nature of the evidentiary support...
Environmental Protection Agency, Washington, DC.
This pamphlet is designed to answer many of the questions that have arisen about nuclear power plants and the environment. It is organized into a question and answer format, with the questions taken from those most often asked by the public. Topics include regulation of nuclear power sources, potential dangers to people's health, whether nuclear…
Answering their kids' questions about sex is a responsibility that many parents dread. Otherwise ... avoided. Parents can help foster healthy feelings about sex if they answer kids' questions in an age-appropriate way. When do ...
Nazem, Amir; Mansoori, G Ali
A century of research has passed since the discovery and definition of Alzheimer's disease (AD), the most common dementing disorder worldwide. However, AD currently lacks definite diagnostic approaches and an effective cure. Moreover, the currently available diagnostic tools are not sufficient for early screening of AD in order to start preventive approaches. Recently, the emerging field of nanotechnology has promised new techniques to solve some of the AD challenges. Nanotechnology refers to the techniques of designing and manufacturing nanosize (1-100 nm) structures through controlled positional and/or self-assembly of atoms and molecules. In this report, we present the promise that nanotechnology brings to research on AD diagnosis and therapy, including its potential for better understanding of the molecular mechanisms at the root of AD, early diagnosis of AD, and effective treatment. The advances in AD research offered by atomic force microscopy, single-molecule fluorescence microscopy and NanoSIMS microscopy are examined here. In addition, recently proposed applications of nanotechnology for the early diagnosis of AD, including the bio-barcode assay, localized surface plasmon resonance nanosensors, quantum dots and nanomechanical cantilever arrays, are analyzed. Applications of nanotechnology in AD therapy, including neuroprotection against oxidative stress, anti-amyloid therapeutics, neuroregeneration and drug delivery beyond the blood-brain barrier (BBB), are discussed and analyzed. All of these applications could improve the treatment approach of AD and other neurodegenerative diseases. A complete cure for AD may become feasible by combining nanotechnology with other novel approaches, such as stem cell technology.
As organizations consider connecting to the Internet, the issue of internetwork security becomes more important. There are many tools and components that can be used to secure a network, one of which is a firewall. Modern firewalls offer highly flexible private network security by controlling and monitoring all communications passing into or out of the private network. Specifically designed for security, firewalls become the private network's single point of attack from Internet intruders. Application gateways (or proxies) that have been written to be secure against even the most persistent attacks ensure that only authorized users and services access the private network. One-time passwords prevent intruders from `sniffing' and replaying the usernames and passwords of authorized users to gain access to the private network. Comprehensive logging permits constant and uniform system monitoring. `Address spoofing' attacks are prevented. The private network may use registered or unregistered IP addresses behind the firewall. Firewall-to-firewall encryption establishes a `virtual private network' across the Internet, preventing intruders from eavesdropping on private communications, eliminating the need for costly dedicated lines.
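The one-time passwords mentioned above can be illustrated with an HMAC-based scheme in the style of RFC 4226 (HOTP): server and client share a secret and a moving counter, so a password captured by a network sniffer is useless once it has been used. This is a standard-algorithm sketch, not a description of any particular firewall product's implementation.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password per the RFC 4226 HOTP construction."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test secret; each counter value yields a fresh password.
otp1 = hotp(b"12345678901234567890", 0)
otp2 = hotp(b"12345678901234567890", 1)
```

Because the counter advances after every successful login, replaying `otp1` against a server already at counter 1 fails, which is exactly the property that defeats username/password sniffing.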
Amin, Waqas; Kang, Hyunseok P; Egloff, Ann Marie; Singh, Harpreet; Trent, Kerry; Ridge-Hetrick, Jennifer; Seethala, Raja R; Grandis, Jennifer; Parwani, Anil V
The Specialized Program of Research Excellence (SPORE) in Head and Neck Cancer neoplasm virtual biorepository is a bioinformatics-supported system to incorporate data from various clinical, pathological, and molecular systems into a single architecture based on a set of common data elements (CDEs) that provides semantic and syntactic interoperability of data sets. The various components of this annotation tool include the Development of Common Data Elements (CDEs) that are derived from College of American Pathologists (CAP) Checklist and North American Association of Central Cancer Registries (NAACR) standards. The Data Entry Tool is a portable and flexible Oracle-based data entry device, which is an easily mastered web-based tool. The Data Query Tool helps investigators and researchers to search de-identified information within the warehouse/resource through a 'point and click' interface, thus enabling only the selected data elements to be essentially copied into a data mart using a multi dimensional model from the warehouse's relational structure. The SPORE Head and Neck Neoplasm Database contains multimodal datasets that are accessible to investigators via an easy to use query tool. The database currently holds 6553 cases and 10607 tumor accessions. Among these, there are 965 metastatic, 4227 primary, 1369 recurrent, and 483 new primary cases. The data disclosure is strictly regulated by user's authorization. The SPORE Head and Neck Neoplasm Virtual Biorepository is a robust translational biomedical informatics tool that can facilitate basic science, clinical, and translational research. The Data Query Tool acts as a central source providing a mechanism for researchers to efficiently find clinically annotated datasets and biospecimens that are relevant to their research areas. The tool protects patient privacy by revealing only de-identified data in accordance with regulations and approvals of the IRB and scientific review committee
Sinharay, Sandip; Duong, Minh Q.; Wood, Scott W.
As noted by Fremer and Olson, analysis of answer changes is often used to investigate testing irregularities because the analysis is readily performed and has proven its value in practice. Researchers such as Belov, Sinharay and Johnson, van der Linden and Jeon, van der Linden and Lewis, and Wollack, Cohen, and Eckerly have suggested several…
Question answering in exams is a typical question-answering task that aims to test how accurately a model can answer the questions in exams. In this paper, we use a general deep learning model to solve the multiple-choice question-answering task. Our approach is to build distributed word embeddings of questions and answers instead of manually extracting features or using linguistic tools; to improve accuracy, an external corpus is introduced. The framework uses a two-layer LSTM with attention, which achieves a significant result. For comparison, we introduce the simple long short-term memory (QA-LSTM) model, the QA-LSTM-CNN model and the QA-LSTM with attention model as references. Experiments demonstrate the superior performance of the two-layer LSTM with attention compared to the other models on the question-answering task.
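The selection step in such a multiple-choice model, embedding the question and each candidate answer and picking the candidate most similar to the question, can be sketched with a deliberately simplified stand-in: bag-of-words vectors and cosine similarity in place of the learned LSTM-with-attention encoder. The question and answer strings are invented for illustration.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words count vector. A real system would
    use learned distributed representations (e.g. an LSTM encoder)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_answer(question, candidates):
    """Pick the candidate whose embedding is closest to the question's."""
    q = embed(question)
    return max(candidates, key=lambda c: cosine(q, embed(c)))

question = "what gas do plants absorb from the air"
answers = ["plants absorb carbon dioxide from the air",
           "the capital of France is Paris",
           "water boils at 100 degrees"]
choice = best_answer(question, answers)
```

The deep model replaces `embed` and `cosine` with learned encoders and an attention-weighted similarity, but the argmax-over-candidates structure is the same.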
Gobeill, Julien; Gaudinat, Arnaud; Pasche, Emilie; Vishnyakova, Dina; Gaudet, Pascale; Bairoch, Amos; Ruch, Patrick
Biomedical professionals have access to a huge amount of literature, but when they use a search engine, they often have to deal with too many documents to efficiently find the appropriate information in a reasonable time. In this perspective, question-answering (QA) engines are designed to display answers, which were automatically extracted from the retrieved documents. Standard QA engines in the literature process a user question, then retrieve relevant documents and finally extract some possible answers out of these documents using various named-entity recognition processes. In our study, we try to answer complex genomics questions, which can be adequately answered only using Gene Ontology (GO) concepts. Such complex answers cannot be found using state-of-the-art dictionary- and redundancy-based QA engines. We compare the effectiveness of two dictionary-based classifiers for extracting correct GO answers from a large set of 100 retrieved abstracts per question. In the same way, we also investigate the power of GOCat, a GO supervised classifier. GOCat exploits the GOA database to propose GO concepts that were annotated by curators for similar abstracts. This approach is called deep QA, as it adds an original classification step, and exploits curated biological data to infer answers, which are not explicitly mentioned in the retrieved documents. We show that for complex answers such as protein functional descriptions, the redundancy phenomenon has a limited effect. Similarly, usual dictionary-based approaches are relatively ineffective. In contrast, we demonstrate how existing curated data, beyond information extraction, can be exploited by a supervised classifier, such as GOCat, to massively improve both the quantity and the quality of the answers, with a +100% improvement for both recall and precision. Database URL: http://eagl.unige.ch/DeepQA4PA/. © The Author(s) 2015. Published by Oxford University Press.
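The "similar abstracts vote with their curated annotations" idea behind GOCat can be sketched as a nearest-neighbour classifier: rank curated abstracts by word overlap with the query abstract, then let the top-k neighbours vote with their GO terms. This is a toy illustration of the general technique, not GOCat's actual implementation; all record fields and GO IDs below are made up for the example.

```python
def jaccard(a, b):
    # Word-overlap similarity between two token collections.
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def propose_go_terms(query_tokens, curated, k=2):
    # Rank curated abstracts by overlap with the query abstract,
    # then vote with the GO terms annotated on the top-k neighbours.
    ranked = sorted(curated,
                    key=lambda rec: jaccard(query_tokens, rec["tokens"]),
                    reverse=True)
    votes = {}
    for rec in ranked[:k]:
        for term in rec["go_terms"]:
            votes[term] = votes.get(term, 0) + 1
    return sorted(votes, key=lambda t: -votes[t])
```

The key property, as the abstract notes, is that a proposed GO term need not appear verbatim in any retrieved document; it is inferred from curators' annotations of similar texts.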
Zhang, Yin; Deng, Shengli
Introduction: In recent years, the introduction of social question and answer services and other Internet tools have expanded the ways in which people have their questions answered. There has been speculation and debate over whether such services and other Internet tools are replacing library virtual reference services. Method: Most previous…
Eiter, Thomas; Fink, Michael; Woltran, Stefan
In recent research on non-monotonic logic programming, strong equivalence of logic programs P and Q has repeatedly been considered, which holds if the programs P union R and Q union R have the same answer sets for any other program R. This property strengthens equivalence of P and Q with respect to answer sets (which is the particular case where R is the empty set), and has applications in program optimization, verification, and modular logic programming. In this paper, we consider more lib...
PET imaging has for many years been a versatile tool for non-invasive imaging of neuro-physiology and, indeed, whole body physiology. Quantitative PET imaging of trace amounts of radioactivity is scientifically elegant and can be very complex. This lecture focuses on whether and where this test is clinically useful. Because of the research tradition, PET imaging has been perceived as an 'expensive' test, as it costs more per scan than CT and MRI scans at most institutions. Such a superficial analysis is incorrect, however, as it is increasingly recognized that imaging costs, which in some circumstances will be increased by the use of PET, are only a relatively small component of patient care costs. Thus, PET may raise imaging costs and the number of imaging procedures in some settings, though PET may reduce imaging test numbers in other settings. However, the analysis must focus on the total costs of patient management. Analyses focused on total patient care costs, including the cost of hospitalization and the cost of surgery as well as imaging costs, have shown that PET can substantially reduce total patient care costs in several settings. This is achieved by providing a more accurate diagnosis, and thus having fewer instances of an incorrect diagnosis resulting in subsequent inappropriate surgery or investigations. Several institutions have shown scenarios in which PET for tumor imaging is cost effective. While the specific results of the analyses vary based on disease prevalence and cost input values for each procedure, as well as the projected performance of PET, the similar results showing total care cost savings in the management of several common cancers strongly support the rationale for the use of PET in cancer management. In addition, promising clinical results are forthcoming in several other illnesses, suggesting PET will have broader utility than these uses alone. Thus, while PET is an 'expensive' imaging procedure and has considerable utility as a research
A descriptive qualitative research design was used to determine whether participants ... simulation as a teaching method; a manikin offering effective learning; confidence ..... Tesch R. Qualitative Research: Analysis Types and Software Tools.
Quality of Online Chat Reference Answers Differs between Local and Consortium Library Staff: Providing Consortium Staff with More Local Information Can Mitigate these Differences. A Review of: Meert, D.L., & Given, L.M. (2009). Measuring quality in chat reference consortia: A comparative analysis of responses to users' queries. College & Research Libraries, 70(1), 71-84.
Laura Newton Miller
standards 82% of the time. The groups showed the most significant differences when separated into the question categories. Local library staff met the standards for "Library User Information" questions 97% of the time, while consortia staff met the standards only 76% of the time. "Request for Instruction" questions were answered with 97% success by local library staff and with 84% success by consortia staff. Local library staff met the "Request for Academic Information" standards 90% of the time, while consortia staff met these standards 87% of the time. For "Miscellaneous Non-Library Information" questions, 93% of local and 83% of consortia staff met the reference transaction standards. For the second part of the study, 89% of local library staff answered the questions in real time, as opposed to only 69% of non-local staff. The three most common reasons for not answering in real time (known as deferment categories) included not knowing the answer (48% local; 40% consortia), technical difficulty (26% local; 16% consortia), and information not being available (15% local; 31% consortia). Conclusion – The results of this research reveal that there are differences in the quality of answers between local and non-local staff when taking part in an online chat reference consortium, although these discrepancies vary depending on the type of question. Providing non-local librarians with the information they need to answer questions accurately and in real time can mitigate these differences.
Powers, Christina M.; Grieger, Khara D.; Hendren, Christine Ogilvie; Meacham, Connie A.; Gurevich, Gerald; Lassiter, Meredith Gooding; Money, Eric S.; Lloyd, Jennifer M.; Beaulieu, Stephen M.
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb
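At its core, multi-stakeholder prioritization of research gaps reduces to aggregating each group's importance ratings and ranking. The CEAWeb tool's actual elicitation and aggregation method is not specified here; this is a deliberately simple mean-rating sketch with invented stakeholder groups and gap names.

```python
def prioritize_research_gaps(ratings):
    # ratings: {stakeholder_group: {research_gap: importance_score}}.
    # Average each gap's score across stakeholder groups and rank
    # the gaps from highest to lowest mean importance.
    gaps = next(iter(ratings.values())).keys()
    means = {g: sum(group[g] for group in ratings.values()) / len(ratings)
             for g in gaps}
    return sorted(means, key=lambda g: -means[g])
```

Real tools typically add weighting of groups, uncertainty scoring, or Delphi-style iteration on top of this basic aggregation step.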
Adeline Phaik Harn Chua; Kenneth R. Deans; Craig M. Parker
Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, and most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to...
Eloranta, E. W.; Spuler, S.; Hayman, M. M.
Many aspects of air quality research require information on the vertical distribution of pollution. Traditional measurements, obtained from surface based samplers, or passive satellite remote sensing, do not provide vertical profiles. Lidar can provide profiles of aerosol properties. However, traditional backscatter lidar suffers from uncertain calibrations with poorly constrained algorithms. These problems are avoided using High Spectral Resolution Lidar (HSRL), which provides absolutely calibrated vertical profiles of aerosol properties. The University of Wisconsin HSRL systems measure 532 nm wavelength aerosol backscatter cross-sections, extinction cross-sections, depolarization, and attenuated 1064 nm backscatter. These instruments are designed for long-term deployment at remote sites with minimal local support. Processed data is provided for public viewing and download in real-time on our web site "http://hsrl.ssec.wisc.edu". Air pollution applications of HSRL data will be illustrated with examples acquired during air quality field programs including KORUS-AQ, DISCOVER-AQ, LAMOS and FRAPPE. Observations include (1) long range transport of dust, air pollution and smoke; (2) fumigation episodes where elevated pollution is mixed down to the surface; (3) visibility restrictions by aerosols; and (4) diurnal variations in atmospheric optical depth. While HSRL is a powerful air quality research tool, its application in routine measurement networks is hindered by the high cost of current systems. Recent technical advances promise a next generation HSRL using telecom components to greatly reduce system cost. This paper will present data generated by a prototype low cost system constructed at NCAR. In addition to lower cost, operation at a non-visible infrared wavelength near 780 nm removes all FAA restrictions on the operation.
Engholm, Gerda; Ferlay, Jacques; Christensen, Niels; Bray, Freddie; Gjerstorff, Marianne L; Klint, Asa; Køtlum, Jóanis E; Olafsdóttir, Elínborg; Pukkala, Eero; Storm, Hans H
The NORDCAN database and program ( www.ancr.nu ) include detailed information and results on cancer incidence, mortality and prevalence in each of the Nordic countries over five decades and has lately been supplemented with predictions of cancer incidence and mortality; future extensions include the incorporation of cancer survival estimates. The data originates from the national cancer registries and causes of death registries in Denmark, Finland, Iceland, Norway, Sweden, and Faroe Islands and is regularly updated. Presently 41 cancer entities are included in the common dataset, and conversions of the original national data according to international rules ensure comparability. With 25 million inhabitants in the Nordic countries, 130 000 incident cancers are reported yearly, alongside nearly 60 000 cancer deaths, with almost a million persons living with a cancer diagnosis. This web-based application is available in English and in each of the five Nordic national languages. It includes comprehensive and easy-to-use descriptive epidemiology tools that provide tabulations and graphs, with further user-specified options available. The NORDCAN database aims to provide comparable and timely data to serve the varying needs of policy makers, cancer societies, the public, and journalists, as well as the clinical and research community.
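Cross-country comparability of the incidence and mortality figures NORDCAN reports rests on standard descriptive-epidemiology calculations such as the directly age-standardized rate. As a minimal sketch of that calculation (not NORDCAN's code; the numbers below are invented):

```python
def age_standardized_rate(cases, person_years, std_pop):
    # Direct standardization: weight each age group's incidence rate
    # (cases / person-years) by a standard population, and express
    # the result per 100,000 person-years.
    assert len(cases) == len(person_years) == len(std_pop)
    weighted = sum(w * c / py for c, py, w in zip(cases, person_years, std_pop))
    return 100_000 * weighted / sum(std_pop)
```

Using the same standard population (e.g. a Nordic or world standard) for every country removes differences in age structure from the comparison, which is what makes the tabulated rates comparable across registries.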
We introduce the notion of Electric Field Encephalography (EFEG), based on measuring electric fields of the brain, and demonstrate, using computer modeling, that given the appropriate electric field sensors this technique may have significant advantages over the current EEG technique. Unlike EEG, EFEG can be used to measure brain activity in a contactless and reference-free manner at significant distances from the head surface. Principal component analysis using simulated cortical sources demonstrated that electric field sensors positioned 3 cm away from the scalp and characterized by the same signal-to-noise ratio as EEG sensors provided the same number of uncorrelated signals as scalp EEG. When positioned on the scalp, EFEG sensors provided 2-3 times more uncorrelated signals. This significant increase in the number of uncorrelated signals can be used for more accurate assessment of brain states for non-invasive brain-computer interfaces and neurofeedback applications. It also may lead to major improvements in source localization precision. Source localization simulations for the spherical and Boundary Element Method (BEM) head models demonstrated that the localization errors are reduced two-fold when using electric fields instead of electric potentials. We have identified several techniques that could be adapted for the measurement of the electric field vector required for EFEG and anticipate that this study will stimulate new experimental approaches to utilize this new tool for functional brain research.
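The "number of uncorrelated signals" in a principal component analysis of this kind is typically estimated by counting the components whose variance clearly exceeds the sensor noise floor. A generic numpy sketch of that counting step (not the authors' simulation code; the threshold factor is an assumption):

```python
import numpy as np

def count_uncorrelated_signals(X, noise_var, factor=2.0):
    # X: (n_samples, n_sensors) array of sensor readings.
    # Count the principal components whose variance (eigenvalue of
    # the sensor covariance matrix) exceeds `factor` times the
    # per-sensor noise variance.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals = np.linalg.eigvalsh(cov)
    return int((eigvals > factor * noise_var).sum())
```

With simulated cortical sources, comparing this count for potential (EEG) versus field (EFEG) sensor arrays gives the "2-3 times more uncorrelated signals" figure the abstract reports.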
North, M. J. N.
Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition for marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital as well as individual learning parameters. The links represent transmission lines with individual capacities taken from a range of central distribution, outlying distribution and feeder line types. The application of SMART II to electric power systems studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
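The marginal-cost dynamics such agents trade under can be illustrated with a single-node merit-order clearing step: dispatch the cheapest offers first, and let the last dispatched unit set the price. This is a textbook sketch for orientation, not SMART II's agent logic (which adds learning agents and network constraints); generator names and numbers are invented.

```python
def clear_spot_market(offers, demand):
    # offers: list of (marginal_cost, capacity_mw, name) generator bids.
    # Dispatch in merit order (cheapest first) until demand is met;
    # the clearing price is the marginal cost of the last unit taken.
    price, dispatch, remaining = 0.0, [], demand
    for cost, capacity, name in sorted(offers):
        if remaining <= 0:
            break
        taken = min(capacity, remaining)
        dispatch.append((name, taken))
        price, remaining = cost, remaining - taken
    if remaining > 0:
        raise ValueError("demand exceeds total generation capacity")
    return price, dispatch
```

In an agent-based model, each generator agent would adjust its offers over repeated clearings based on earnings, which is where results start to diverge from the static linear programming solution.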
Vernardos, G.; Fluke, C. J.; Croton, D.; Bate, N. F.
As synoptic all-sky surveys begin to discover new multiply lensed quasars, the flow of data will enable statistical cosmological microlensing studies of sufficient size to constrain quasar accretion disk and supermassive black hole properties. In preparation for this new era, we are undertaking the GPU-Enabled, High Resolution cosmological MicroLensing parameter survey (GERLUMPH). We present here the GERLUMPH Data Release 1, which consists of 12,342 high resolution cosmological microlensing magnification maps and provides the first uniform coverage of the convergence, shear, and smooth matter fraction parameter space. We use these maps to perform a comprehensive numerical investigation of the mass-sheet degeneracy, finding excellent agreement with its predictions. We study the effect of smooth matter on microlensing induced magnification fluctuations. In particular, in the minima and saddle-point regions, fluctuations are enhanced only along the critical line, while in the maxima region they are always enhanced for high smooth matter fractions (≈0.9). We describe our approach to data management, including the use of an SQL database with a Web interface for data access and online analysis, obviating the need for individuals to download large volumes of data. In combination with existing observational databases and online applications, the GERLUMPH archive represents a fundamental component of a new microlensing eResearch cloud. Our maps and tools are publicly available at http://gerlumph.swin.edu.au/
McMahan, Tracy A.; Shea, Charlotte A.; Finckenor, Miria; Ferguson, Dale
As NASA plans and implements the Vision for Space Exploration, managers, engineers, and scientists need lunar environment information that is readily available and easily accessed. For this effort, lunar environment data was compiled from a variety of missions from Apollo to more recent remote sensing missions, such as Clementine. This valuable information comes not only in the form of measurements and images but also from the observations of astronauts who have visited the Moon and people who have designed spacecraft for lunar missions. To provide a research tool that makes the voluminous lunar data more accessible, the Space Environments and Effects (SEE) Program, managed at NASA's Marshall Space Flight Center (MSFC) in Huntsville, AL, organized the data into a DVD knowledgebase: the Lunar e-Library. This searchable collection of 1100 electronic (.PDF) documents and abstracts makes it easy to find critical technical data and lessons learned from past lunar missions and exploration studies. The SEE Program began distributing the Lunar e-Library DVD in 2006. This paper describes the Lunar e-Library development process (including a description of the databases and resources used to acquire the documents) and the contents of the DVD product, demonstrates its usefulness with focused searches, and provides information on how to obtain this free resource.
King, Stephanie L
Over the years, playback experiments have helped further our understanding of the wonderful world of animal communication. They have provided fundamental insights into animal behaviour and the function of communicative signals in numerous taxa. As important as these experiments are, however, there is strong evidence to suggest that the information conveyed in a signal may only have value when presented interactively. By their very nature, signalling exchanges are interactive and therefore, an interactive playback design is a powerful tool for examining the function of such exchanges. While researchers working on frog and songbird vocal interactions have long championed interactive playback, it remains surprisingly underused across other taxa. The interactive playback approach is not limited to studies of acoustic signalling, but can be applied to other sensory modalities, including visual, chemical and electrical communication. Here, I discuss interactive playback as a potent yet underused technique in the field of animal behaviour. I present a concise review of studies that have used interactive playback thus far, describe how it can be applied, and discuss its limitations and challenges. My hope is that this review will result in more scientists applying this innovative technique to their own study subjects, as a means of furthering our understanding of the function of signalling interactions in animal communication systems. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Weber, Griffin M; Murphy, Shawn N; McMurry, Andrew J; Macfadden, Douglas; Nigrin, Daniel J; Churchill, Susanne; Kohane, Isaac S
The authors developed a prototype Shared Health Research Information Network (SHRINE) to identify the technical, regulatory, and political challenges of creating a federated query tool for clinical data repositories. Separate Institutional Review Boards (IRBs) at Harvard's three largest affiliated health centers approved use of their data, and the Harvard Medical School IRB approved building a Query Aggregator Interface that can simultaneously send queries to each hospital and display aggregate counts of the number of matching patients. Our experience creating three local repositories using the open source Informatics for Integrating Biology and the Bedside (i2b2) platform can be used as a road map for other institutions. The authors are actively working with the IRBs and regulatory groups to develop procedures that will ultimately allow investigators to obtain identified patient data and biomaterials through SHRINE. This will guide us in creating a future technical architecture that is scalable to a national level, compliant with ethical guidelines, and protective of the interests of the participating hospitals.
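The Query Aggregator Interface's central operation is simple: fan a query out to each site's repository and display the sum of the per-site matching-patient counts. A minimal sketch of that aggregation step follows; the site names are invented, and the small-cell masking rule shown is a common privacy convention, not necessarily SHRINE's actual policy.

```python
def aggregate_patient_counts(site_counts, min_count=10):
    # site_counts: {site_name: matching_patient_count} as returned by
    # each hospital's repository for the same query.
    # Sum the counts for the aggregate display, and mask any per-site
    # count below min_count to reduce re-identification risk.
    total = sum(site_counts.values())
    display = {site: (n if n >= min_count else f"<{min_count}")
               for site, n in site_counts.items()}
    return total, display
```

The hard parts of the real system are not this summation but the governance around it: per-site IRB approval, query authorization, and keeping identified data behind each hospital's firewall.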
Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu
The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Fackrell, Kathryn; Fearnley, Constance; Hoare, Derek J; Sereda, Magdalena
Hypersensitivity to external sounds is often comorbid with tinnitus and may be significant for adherence to certain types of tinnitus management. Therefore, a clear measure of sensitivity to sound is important. The aim of this study was to evaluate the validity and reliability of the Hyperacusis Questionnaire (HQ) for use as a measurement tool using data from a sample of 264 adults who took part in tinnitus research. We evaluated the HQ factor structure, internal consistency, convergent and discriminant validity, and floor and ceiling effects. Internal consistency was high (Cronbach's alpha = 0.88) and moderate correlations were observed between the HQ, uncomfortable loudness levels, and other health questionnaires. Confirmatory factor analysis revealed that the original HQ three-factor solution and a one-factor solution were both a poor fit to the data. Four problematic items were removed and exploratory factor analysis identified a two-factor (attentional and social) solution. The original three-factor structure of the HQ was not confirmed. All fourteen items do not accurately assess hypersensitivity to sound in a tinnitus population. We propose a 10-item (2-factor) version of the HQ, which will need to be confirmed using a new tinnitus and perhaps nontinnitus population.
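The internal-consistency figure reported above (Cronbach's alpha = 0.88) comes from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A self-contained sketch of that computation (toy data, not the study's 264-respondent sample):

```python
def cronbach_alpha(item_scores):
    # item_scores: one list per questionnaire item, each holding the
    # same respondents' scores in the same order.
    k, n = len(item_scores), len(item_scores[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in item_scores) / var(totals))
```

Values near 1 indicate that the items rise and fall together across respondents; 0.88 for the 14-item HQ is conventionally read as high internal consistency, even though, as the factor analysis here shows, high alpha alone does not establish a sound factor structure.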
This article disseminates the results of research on scientific institutions' use of content marketing strategy tools, showing the extent to which modern marketing tools are used by scientific institutions in their Internet communication. Currently, the content marketing concept is accepted not only as a fashionable trend in modern marketing but, above all, as an important tool for improving the Internet message so that it effectively engages users. An optimal selection and use of content marketing tools provides opportunities for enhancing the efficiency with which the generated message is received (accepted).
Introduction: The study objective was to determine the accuracy of answers to clinical questions by emergency medicine (EM) residents conducting Internet searches using Google. Emergency physicians commonly turn to outside resources to answer clinical questions that arise in the emergency department (ED). Internet access in the ED has supplanted textbooks for reference because it is perceived as being more up to date. Although Google is the most widely used general Internet search engine, it is not medically oriented and merely provides links to other sources. Users must judge the reliability of the information obtained through the links. We frequently observed EM faculty and residents using Google rather than medicine-specific databases to seek answers to clinical questions. Methods: Two EM faculty members developed a clinically oriented test for residents to take without the use of any outside aid. They were instructed to answer each question only if they were confident enough of their answer to implement it in a patient-care situation. Questions marked as unsure or answered incorrectly were used to construct a second test for each subject. On the second test, they were instructed to use Google as a resource to find links that contained answers. Results: Thirty-three residents participated. The means for the initial test were 32% correct, 28% incorrect, and 40% unsure. On the Google test, the mean for correct answers was 59%; 33% of answers were incorrect and 8% were unsure. Conclusion: EM residents' ability to answer clinical questions correctly by using Web sites from Google searches was poor. More concerning was that unsure answers decreased, whereas incorrect answers increased. The Internet appears to have given the residents a false sense of security in their answers. Innovations, such as Internet access in the ED, should be studied carefully before being accepted as reliable tools for teaching clinical decision making. [West J Emerg Med. 2011
Shahriar, Md Sumon; de Souza, Paulo; Timms, Greg
We review existing query answering systems for sensor data. We then propose an extended query answering approach termed smart query, specifically for marine sensor data. The smart query answering system integrates pattern queries and continuous queries. The proposed smart query system considers both streaming data and historical data from marine sensor networks. The smart query also uses query relaxation technique and semantics from domain knowledge as a recommender system. The proposed smart query benefits in building data and information systems for marine sensor networks.
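Query relaxation, one of the techniques the smart query system uses, can be sketched generically: try the exact query first, and if it returns nothing, progressively widen the match window until some records qualify. This toy numeric-field version is an illustration of the general technique, not the authors' system; the field name and tolerance schedule are invented.

```python
def smart_query(records, field, target, tolerances=(0.5, 1.0, 2.0)):
    # records: list of dicts, e.g. sensor readings.
    # Return (tolerance_used, matching_records): start with an exact
    # match (tolerance 0) and relax the window step by step until at
    # least one record matches, or give up after the last tolerance.
    for tol in (0.0,) + tuple(tolerances):
        hits = [r for r in records if abs(r[field] - target) <= tol]
        if hits:
            return tol, hits
    return None, []
```

A domain-semantics layer, as described above, would choose the relaxation steps from knowledge about the sensor (e.g. typical measurement error) rather than a fixed schedule.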
This book contains answers to all exercises featured in the accompanying textbook Mathematics for Common Entrance Three (Extension), which provides essential preparation for Level 3 of the ISEB 13+ Mathematics exam, as well as for CASE and other scholarship exams. - Clean, clear layout for easy marking. - Includes examples of high-scoring answers with diagrams and workings. Also available to purchase from the Galore Park website www.galorepark.co.uk : - Mathematics for Common Entrance Three (Extension). - Mathematics for Common Entrance One. - Mathematics for Common Entrance One Answers. - M
Full Text Available Buildings need to be more environmentally benign, since the building sector is responsible for about 40% of all energy and material use in Sweden. For this reason a unique cooperation between companies, municipalities and the Government, called “Building, Living and Property Management for the Future” (in short, “The Building Living Dialogue”), has been going on since 2003. The project focuses on: (a) a healthy indoor environment, (b) efficient use of energy, and (c) efficient resource management. In accordance with the dialogue targets, two research projects were initiated aiming at developing an environmental rating tool taking into account both building sector requirements and expectations and national and international research findings. This paper describes the first phase of the development work, in which stakeholders and researchers cooperate. It includes results from inventories and, based on this experience, discusses procedures for developing assessment tools and what the desirable features of a broadly accepted building rating tool could be.
Full Text Available This article disseminates the results of a programme of detailed archaeological survey and archive research on one of Europe's most important surviving late-medieval Guild Chapels — that of the Holy Cross Guild, Stratford-upon-Avon (Warwickshire). Today the building is part of Stratford-upon-Avon's tourist trail, located directly opposite William Shakespeare's home, 'New Place', and visited by thousands of tourists every year. However, its archaeological and historical significance has been overlooked owing to the extensive restoration of the building in the 19th and 20th centuries. This destroyed evidence for an internationally significant scheme of wall paintings within the Chapel, paid for by the London Mayor and Stratford-upon-Avon merchant, Hugh Clopton, an important member of the Holy Cross Guild and the original builder of 'New Place'. The paintings also have an important connection with Stratford-upon-Avon's most famous son, William Shakespeare, whose father may have been involved in their destruction and removal during the 16th century. Research by a team of historical archaeologists and digital heritage specialists at the Department of Archaeology, University of York, has revealed the significance of the Guild Chapel through the creation of a digital model and textual paradata, which form the focus of this article. The project is ground-breaking in that it moves beyond the traditional use of digital models as virtual reconstructions of past buildings to use the model itself as a research tool through which the user can explore and validate the evidence for the scheme directly. This is achieved through the creation of a palimpsest of antiquarian drawings of the paintings, made as they were revealed during restoration works in the 19th and 20th centuries, and set within their 3-dimensional architectural context. The model allows the user to compare and contrast differences in the recording methods, iconographies and interpretations of
Grisham, William; Schottler, Natalie A.; Valli-Marill, Joanne; Beck, Lisa; Beatty, Jackson
This completely computer-based module's purpose is to introduce students to bioinformatics resources. We present an easy-to-adopt module that weaves together several important bioinformatic tools so students can grasp how these tools are used in answering research questions. Students integrate information gathered from websites dealing with…
In addition to outlining the present status of safety performance assessment in the disposal of high-level spent nuclear fuel (for example, the criteria considered, the models used, and the scenarios predicted, etc.), the seminar was structured around future safety predictions within realistic geological time references, namely, the periods prior to, during, and subsequent to the next ice-age. Emphasis was placed on understanding the drastic changes in climate that probably await us in Scandinavia, how these will influence a repository located deep in the bedrock and, in turn, the reliability of the engineered and natural barriers selected to safeguard future generations from biospheric contamination. Several questions were posed, such as, will the copper canisters serve their function, or will the formation of new, or the reactivation of older, faults during glacial epochs jeopardise our safety performance predictions, or will man himself impose the greatest safety threat? This condensed summary reflects some of the main opinions expressed and inferred during the seminar, both the positive and negative aspects, in the hope of representing a balanced overview of the present status of research concerning the final disposal of spent nuclear fuel. At the beginning of each section a list of the main contributors is given and, where appropriate, individuals are referenced in the text. Because of the condensed nature of the summary, it is inevitable that some participants may have been underrepresented or quoted out of context. Any misrepresentation or misinterpretation of data discussed at the seminar is the responsibility of the translator. (au) (see INIS 22:44587 for the original report)
Henry, Nancy L.
Technology and a variety of resources play an important role in students' educational lives. Vygotsky's (1987) theory of tool mediation suggests that cultural tools, such as computer software influence individuals' thinking and action. However, it is not completely understood how technology and other resources influence student action. Middle…
Pollack, Martha E
The importance of plan inference in models of conversation has been widely noted in the computational-linguistics literature, and its incorporation in question-answering systems has enabled a range...
Shilgalis, Thomas W.
A number of questions are posed that can be answered with the aid of calculus. These include best value problems, best shape problems, problems involving integration, and growth and decay problems. (MP)
Huang, Jia-Hong; Alfadly, Modar; Ghanem, Bernard
Taking an image and a question as input, our method outputs a text-based answer to the question about the given image, a task called Visual Question Answering (VQA). There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs basic questions of the main given question. The second module takes the main question, the image, and these basic questions as input and outputs the text-based answer to the main question. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a criterion for how to exploit these basic questions to help answer the main question. Our method is evaluated on the challenging VQA dataset and yields state-of-the-art accuracy, 60.34% in the open-ended task.
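The LASSO step can be illustrated with a toy coordinate-descent solver: sparse weights over candidate basic-question embeddings are fit to reconstruct the main-question embedding. The vectors below are invented and this is only a sketch of the named optimization, not the authors' code:

```python
import numpy as np

def lasso_cd(A, b, lam, iters=200):
    """Minimise 0.5*||A w - b||^2 + lam*||w||_1 by coordinate descent."""
    n = A.shape[1]
    w = np.zeros(n)
    for _ in range(iters):
        for j in range(n):
            r = b - A @ w + A[:, j] * w[j]   # residual excluding feature j
            rho = A[:, j] @ r
            z = A[:, j] @ A[:, j]
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return w

# Columns: invented embeddings of candidate basic questions;
# b: embedding of the main question. Sparse w ranks the useful basic questions.
A = np.array([[1.0, 0.0, 0.9],
              [0.0, 1.0, 0.1],
              [0.0, 0.0, 0.1]])
b = np.array([1.0, 0.1, 0.0])
w = lasso_cd(A, b, lam=0.05)
```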
Ewen, K.; Hoppe, G.
An easily surveyable catalogue of questions is presented, intended to make it easier for medical personnel to become acquainted with the basic knowledge required under the X-ray Ordinance and to acquire expert knowledge in radiation protection. The catalogue is arranged by subject. There are several alternative answers to every question; the right answer is given in the solution index (annex). (HP) [de
Smith, Des H.V.; Moehrenschlager, Axel; Christensen, Nancy; Knapik, Dwight; Gibson, Keith; Converse, Sarah J.
Worldwide, approximately 168 bird species are captive-bred for reintroduction into the wild. Programs tend to be initiated for species with a high level of endangerment. Depressed hatching success can be a problem for such programs and has been linked to artificial incubation. The need for artificial incubation is driven by the practice of multiclutching to increase egg production or by uncertainty over the incubation abilities of captive birds. There has been little attempt to determine how artificial incubation differs from bird-contact incubation. We describe a novel archive (data-logger) egg and use it to compare temperature, humidity, and egg-turning in 5 whooping crane (Grus americana) nests, 4 sandhill crane (G. canadensis) nests, and 3 models of artificial incubator, each of which is used to incubate eggs in whooping crane captive-breeding programs. Mean incubation temperature was 31.7° C for whooping cranes and 32.83° C for sandhill cranes. This is well below that of the artificial incubators (which were set based on a protocol of 37.6° C). Humidity in crane nests varied considerably, but median humidity in all 3 artificial incubators was substantially different from that in the crane nests. Two artificial incubators failed to turn the eggs in a way that mimicked crane egg-turning. Archive eggs are an effective tool for guiding the management of avian conservation breeding programs, and can be custom-made for other species. They also have potential to be applied to research on wild populations.
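Reducing archive-egg logger samples to the statistics the study compares (mean temperature, median humidity) is straightforward; a small standard-library sketch with invented readings, not the study's data:

```python
# Hypothetical sketch of summarising archive-egg logger samples.
from statistics import mean, median

def summarise(samples):
    """samples: list of (temp_C, humidity_pct) tuples from the archive egg."""
    temps = [t for t, _ in samples]
    hums = [h for _, h in samples]
    return {"mean_temp": round(mean(temps), 2), "median_humidity": median(hums)}

# Invented nest readings, loosely in the range the abstract reports.
nest = [(31.5, 60), (31.9, 70), (31.7, 65)]
print(summarise(nest))  # {'mean_temp': 31.7, 'median_humidity': 65}
```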
Benis, Arriel; Hoshen, Moshe
Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data supporting Public Health Knowledge Discovery. It provides domain-experts a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and ADHD filled-prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi aims to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes with similar trends. DisEpi combines hierarchical clustering graphics and a heatmap where color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain-expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain-expert was able to efficiently identify that: (1) relatively younger children, and particularly the youngest children in a class, are treated more often; (2) medication incidence increased between 2006 and 2011 but then stabilized; and (3) progression rates of medication incidence are different for each of the 3 main discovered clusters (aka profiles) of treated children. DisEpi delivered results similar to those previously published which used classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain-expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.
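The hierarchical-clustering step can be illustrated generically: group yearly incidence trends by single-linkage agglomeration. This is a minimal sketch under invented data, not the DisEpi implementation:

```python
# Single-linkage agglomerative clustering of incidence time-series (tuples).
def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def agglomerate(series, k):
    """Merge the two closest clusters until k clusters remain."""
    clusters = [[s] for s in series]
    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: min(dist(a, b) for a in clusters[ij[0]]
                                      for b in clusters[ij[1]]))
        clusters[i] += clusters.pop(j)  # single linkage: merge closest pair
    return clusters

# Invented trends: two similar rising series and one flat-high series.
trends = [(1, 2, 3), (1, 2, 4), (9, 9, 9)]
groups = agglomerate(trends, k=2)
# the two rising trends end up in the same cluster
```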
Full Text Available Purpose: The aim of this study is to investigate the importance of Knowledge Management as a tool for improving business processes in a context different from industrial organizations: an archaeological museum. Design/methodology/approach: Using data collected from the National Museum of the Sultanate of Oman in Muscat, a methodology for the analysis and improvement of processes (the Business Cycle Management Process, CMP) is designed and validated. This application is described as an eight-phase process based on Six Sigma DMAIC. The model has a characteristic "P" shape. Findings: As the results obtained by the process improvement initiative show, we highlight the relevance of the improvement in all aspects regarding the security of showcases in that context. Research limitations/implications: The complexity of implementing indicators and the partial vision of the project, as data were only obtained from a part of one of the companies involved in the construction of the museum. An important implication of this paper is that it presents a methodology to improve museum processes, focusing on the reduction of errors and also adding value for the visitors. Practical implications: The relevance of intervening on certain relevant variables at different levels of management performance is verified. Social implications: Improving the quality of leisure services leads to the identification of certain challenges regarding the nature and competitiveness of cultural services. Originality/value: The current work has served as a repository of knowledge applicable to new similar projects, in which to take into account the peculiarities of each case and in particular the level of quality demanded by the client in a cultural context. It is important to take into account the degree of avoidable dissatisfaction (the number of solvable problems that would lead to dissatisfaction), the opportunity for improvement, the reduction of operational waste and the need
Vitova, T.; Brendebach, B.; Dardenne, K.; Denecke, M. A.; Lebid, A.; Löble, M.; Rothe, J.; Batuk, O. N.; Hormes, J.; Liu, D.; Breher, F.; Geckeis, H.
High resolution X-ray emission spectroscopy (HRXES) is becoming increasingly important for our understanding of electronic and coordination structures. The combination of such information with the development of quantum theoretical tools will advance our capability for predicting reactivity and physical behavior, especially of 5f elements. HRXES can be used to remove lifetime broadening by registering the partial fluorescence yield emitted by the sample (i.e., recording a windowed signal from the energy-dispersed fluorescence emission while varying incident photon energy), thereby yielding highly resolved X-ray absorption fine structure (XAFS) spectra. Such spectra often display resonant features not observed in conventional XAFS. The spectrometer set-up can also be used for a wide range of other experiments: for example, resonant inelastic X-ray scattering (RIXS), where bulk electron configuration information in solids, liquids and gases is obtained; valence-selective XAFS studies, where the local structure of a selected element's valence state present in a mixture of valence states can be obtained; and site-selective XAFS studies, where the coordination structure of a metal bound to selected elements can be differentiated from that of all the other ligating atoms. A HRXES spectrometer has been constructed and is presently being commissioned for use at the INE-Beamline for actinide research at the synchrotron source ANKA at FZK. We present the spectrometer's compact, modular design, optimized for attaining a wide range of energies, and first test measurement results. Examples from HRXES studies of lanthanides, the actinides' counterparts, are also shown.
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided the scientific community with innovative web-based points of access, ranging from: the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; Advanced Online Visualization and Analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps to view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including the tools that have revolutionized the way we do research and improve our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
Duffy, Christopher; Leonard, Lorne; Shi, Yuning; Bhatt, Gopal; Hanson, Paul; Gil, Yolanda; Yu, Xuan
Using a series of recent examples and papers we explore some progress and potential for virtual (cyber-) collaboration inspired by access to high-resolution, harmonized public-sector data at continental scales. The first example describes 7 meso-scale catchments in Pennsylvania, USA, where the watershed is forced by climate reanalysis and IPCC (Intergovernmental Panel on Climate Change) future climate scenarios. We show how existing public-sector data and community models are currently able to resolve fine-scale eco-hydrologic processes regarding wetland response to climate change. The results reveal that regional climate change is only part of the story, with large variations in flood and drought response associated with differences in terrain, physiography, landuse and/or hydrogeology. The importance of community-driven virtual testbeds is demonstrated in the context of Critical Zone Observatories, where earth scientists from around the world are organizing hydro-geophysical data and model results to explore new processes that couple hydrologic models with land-atmosphere interaction, biogeochemical weathering, the carbon-nitrogen cycle, landscape evolution and ecosystem services. Critical Zone cyber-research demonstrates how data-driven model development requires a flexible computational structure where process modules are relatively easy to incorporate and where new data structures can be implemented. From the perspective of "Big Data" the paper points out that extrapolating results from virtual observatories to catchments at continental scales will require centralized or cloud-based cyberinfrastructure as a necessary condition for effectively sharing petabytes of data and model results. Finally we outline how innovative cyber-science is supporting earth-science learning, sharing and exploration through the use of on-line tools where hydrologists and limnologists are sharing data and models for simulating the coupled impacts of catchment
Siegel Robert S
Full Text Available Abstract Background A common limitation in guard cell signaling research is that it is difficult to obtain consistent high expression of transgenes of interest in Arabidopsis guard cells using known guard cell promoters or the constitutive 35S cauliflower mosaic virus promoter. An additional drawback of the 35S promoter is that ectopically expressing a gene throughout the organism could cause pleiotropic effects. To improve available methods for targeted gene expression in guard cells, we isolated strong guard cell promoter candidates based on new guard cell-specific microarray analyses of 23,000 genes that are made available together with this report. Results A promoter, pGC1 (At1g22690), drove strong and relatively specific reporter gene expression in guard cells including GUS (beta-glucuronidase) and yellow cameleon YC3.60 (GFP-based calcium FRET reporter). Reporter gene expression was weaker in immature guard cells. The expression of YC3.60 was sufficiently strong to image intracellular Ca2+ dynamics in guard cells of intact plants and resolved spontaneous calcium transients in guard cells. The GC1 promoter also mediated strong reporter expression in clustered stomata in the stomatal development mutant too-many-mouths (tmm). Furthermore, the same promoter::reporter constructs also drove guard cell specific reporter expression in tobacco, illustrating the potential of this promoter as a method for high level expression in guard cells. A serial deletion of the promoter defined a guard cell expression promoter region. In addition, anti-sense repression using pGC1 was powerful for reducing specific GFP gene expression in guard cells while expression in leaf epidermal cells was not repressed, demonstrating strong cell-type preferential gene repression. Conclusion The pGC1 promoter described here drives strong reporter expression in guard cells of Arabidopsis and tobacco plants. It provides a potent research tool for targeted guard cell expression or
Yang, Yingzhen; Costa, Alex; Leonhardt, Nathalie; Siegel, Robert S; Schroeder, Julian I
Background A common limitation in guard cell signaling research is that it is difficult to obtain consistent high expression of transgenes of interest in Arabidopsis guard cells using known guard cell promoters or the constitutive 35S cauliflower mosaic virus promoter. An additional drawback of the 35S promoter is that ectopically expressing a gene throughout the organism could cause pleiotropic effects. To improve available methods for targeted gene expression in guard cells, we isolated strong guard cell promoter candidates based on new guard cell-specific microarray analyses of 23,000 genes that are made available together with this report. Results A promoter, pGC1(At1g22690), drove strong and relatively specific reporter gene expression in guard cells including GUS (beta-glucuronidase) and yellow cameleon YC3.60 (GFP-based calcium FRET reporter). Reporter gene expression was weaker in immature guard cells. The expression of YC3.60 was sufficiently strong to image intracellular Ca2+ dynamics in guard cells of intact plants and resolved spontaneous calcium transients in guard cells. The GC1 promoter also mediated strong reporter expression in clustered stomata in the stomatal development mutant too-many-mouths (tmm). Furthermore, the same promoter::reporter constructs also drove guard cell specific reporter expression in tobacco, illustrating the potential of this promoter as a method for high level expression in guard cells. A serial deletion of the promoter defined a guard cell expression promoter region. In addition, anti-sense repression using pGC1 was powerful for reducing specific GFP gene expression in guard cells while expression in leaf epidermal cells was not repressed, demonstrating strong cell-type preferential gene repression. Conclusion The pGC1 promoter described here drives strong reporter expression in guard cells of Arabidopsis and tobacco plants. It provides a potent research tool for targeted guard cell expression or gene silencing. It is also
Altman, Eric I; Baykara, Mehmet Z; Schwarz, Udo D
Although atomic force microscopy (AFM) was rapidly adopted as a routine surface imaging apparatus after its introduction in 1986, it has not been widely used in catalysis research. The reason is that common AFM operating modes do not provide the atomic resolution required to follow catalytic processes; rather the more complex noncontact (NC) mode is needed. Thus, scanning tunneling microscopy has been the principal tool for atomic scale catalysis research. In this Account, recent developments in NC-AFM will be presented that offer significant advantages for gaining a complete atomic level view of catalysis. The main advantage of NC-AFM is that the image contrast is due to the very short-range chemical forces that are of interest in catalysis. This motivated our development of 3D-AFM, a method that yields quantitative atomic resolution images of the potential energy surfaces that govern how molecules approach, stick, diffuse, and rebound from surfaces. A variation of 3D-AFM allows the determination of forces required to push atoms and molecules on surfaces, from which diffusion barriers and variations in adsorption strength may be obtained. Pushing molecules towards each other provides access to intermolecular interactions between reaction partners. Following reaction, NC-AFM with CO-terminated tips yields textbook images of intramolecular structure that can be used to identify reaction intermediates and products. Because NC-AFM and STM contrast mechanisms are distinct, combining the two methods can produce unique insight. It is demonstrated for surface-oxidized Cu(100) that simultaneous 3D-AFM/STM yields resolution of both the Cu and O atoms. Moreover, atomic defects in the Cu sublattice lead to variations in the reactivity of the neighboring O atoms. It is shown that NC-AFM also allows straightforward imaging of work function variations, which has been used to identify defect charge states on catalytic surfaces and to map charge transfer within an individual
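Potential-energy surfaces of the kind 3D-AFM maps are commonly recovered by integrating measured force-distance curves, E(z) = ∫ from z to z_max of F dz'. The sketch below illustrates only that numerical step with invented samples (sign conventions and normalization vary between setups):

```python
# Illustrative trapezoidal integration of a force-distance curve into a
# potential-energy curve, referenced to zero at the farthest point.
def potential_from_force(z, f):
    """z: tip-sample distances (ascending); f: measured vertical forces.
    Returns E(z) with E(z_max) = 0."""
    e = [0.0] * len(z)
    for i in range(len(z) - 2, -1, -1):
        # accumulate the integral from the far end inward
        e[i] = e[i + 1] + 0.5 * (f[i] + f[i + 1]) * (z[i + 1] - z[i])
    return e

# Invented constant force of 1 (arbitrary units) over z = 0..2:
print(potential_from_force([0.0, 1.0, 2.0], [1.0, 1.0, 1.0]))  # [2.0, 1.0, 0.0]
```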
Zhao, Y.; Zhao, Y. L.; Shao, YW; Hu, T. J.; Zhang, Q.; Ge, X. H.
Cutting force is an important factor that affects machining accuracy, cutting vibration and tool wear. Machining condition monitoring by cutting force measurement is a key technology for intelligent manufacturing. Current cutting force sensors suffer from large volume, complex structure and poor compatibility in practical application. To address these problems, a smart cutting tool for cutting force measurement is proposed in this paper. Commercial MEMS (Micro-Electro-Mechanical System) strain gauges with high sensitivity and small size are adopted as the transducing element of the smart tool, and a structurally optimized cutting tool is fabricated for MEMS strain gauge bonding. Static calibration results show that the developed smart cutting tool is able to measure cutting forces in both X and Y directions, and the cross-interference error is within 3%. Its general accuracy is 3.35% and 3.27% in the X and Y directions, and its sensitivity is 0.1 mV/N, which is very suitable for measuring small cutting forces in high speed and precision machining. The smart cutting tool is portable and reliable for practical application in CNC machine tools.
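Static calibration of such a two-axis sensor amounts to fitting a matrix that maps gauge voltages to forces from known applied loads; cross-interference appears as the off-diagonal terms. A sketch with invented voltages (not the paper's calibration data):

```python
import numpy as np

# Hypothetical calibration: fit C so that force = C @ voltage, using
# least squares over static loads applied separately in X and Y.
V = np.array([[1.00, 0.02],   # gauge voltages (mV) under a pure X load
              [0.03, 1.10]])  # gauge voltages under a pure Y load
F = np.array([[10.0, 0.0],    # known applied forces (N)
              [0.0, 10.0]])

C, *_ = np.linalg.lstsq(V, F, rcond=None)  # solves V @ C = F row-wise
C = C.T  # so that C @ v yields (Fx, Fy)

# A new reading of 0.50 mV on the X gauge should recover roughly 5 N in X,
# with the small off-diagonal terms of C compensating cross-interference.
fx, fy = C @ np.array([0.50, 0.01])
```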
Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy
To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: Two-stage study involving a modified Delphi process to inform stroke audit tool development and inter-tester reliability. Participants: Allied health clinicians. A modified Delphi process to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across 10 clinical records. Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation for all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency, supporting future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps to inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.
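Inter-tester reliability via an ICC can be sketched directly. The formula below is ICC(2,1) (two-way random effects, absolute agreement, single rater), one common choice; the abstract does not specify which ICC model was used, and the ratings here are invented:

```python
# Hedged sketch of ICC(2,1) from a records-by-auditors matrix of scores.
def icc_2_1(ratings):
    """ratings: list of rows, one row per clinical record, one column per auditor."""
    n, k = len(ratings), len(ratings[0])
    gm = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_total = sum((x - gm) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - gm) ** 2 for m in row_means)
    ss_cols = n * sum((m - gm) ** 2 for m in col_means)
    msr = ss_rows / (n - 1)                             # between-records
    msc = ss_cols / (k - 1)                             # between-auditors
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(icc_2_1([[1, 1], [2, 2], [3, 3]]))  # 1.0 for perfect agreement
```

A systematic offset between auditors lowers ICC(2,1) even when the rank ordering agrees, which is why the absolute-agreement form suits audit-tool checks.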
Wiche, Oliver; Székely, Balazs; Moschner, Christin; Heilmeier, Hermann
plots was randomized and every treatment was replicated fivefold. Soil solution was collected weekly with plastic suction cups. Concentrations of trace metals in shoots of oat and in soil solution were measured with ICP-MS. As a result, we found that both the concentrations of trace elements in oat plants and the mobility of P and trace metals in soil solution were increased by intercropping with white lupine. Mixed culture of oat with 11% white lupin significantly increased the concentrations of the trace nutrients Fe, Mn and Zn, as well as the concentrations of the trace metals Pb, La, Nd, Sc, Th and U, in tissues of oat. Surprisingly, mixed cultures with 33% white lupin did not significantly affect trace metal concentrations in oat, which might be the consequence of increasing competition between the roots of white lupin and oat for nutrients and trace metals. In conclusion, we found that mixed cultures of white lupin with cereals might be a powerful tool for enhanced phytoremediation and phytomining. However, processes involved in the physicochemical mechanism of element uptake as affected by the oat/white lupin co-cultivation remain unknown, and further studies on this topic are planned. These studies have been carried out in the framework of the PhytoGerm project, financed by the Federal Ministry of Education and Research, Germany. The authors are grateful to the students and laboratory assistants contributing to the field work and sample preparation.
Full Text Available In the frame of the FP7 POPDAT project the Ionosphere Waves Service (IWS) has been developed and opened for public access by ionosphere experts. IWS forms a database, derived from archived ionospheric wave records, to assist ionosphere and Space Weather research, and to answer the following questions: How can the data of earlier ionospheric missions be reprocessed with current algorithms to gain more profitable results? How could the scientific community be provided with a new insight on wave processes that take place in the ionosphere? The answer is a specific and unique data mining service accessing a collection of topical catalogs that characterize a huge number of recorded occurrences of Whistler-like Electromagnetic Wave Phenomena, Atmosphere Gravity Waves, and Traveling Ionosphere Disturbances. The IWS online service (http://popdat.cbk.waw.pl) allows end users to query an optional set of predefined wave phenomena and their detailed characteristics. These were collected by target-specific event detection algorithms in selected satellite records during the database buildup phase. The result of the performed wave processing thus represents useful information for statistical or comparative investigations of wave types, listed in a detailed catalog of ionospheric wave phenomena. The IWS provides wave event characteristics, extracted by specific software systems from data records of the selected satellite missions. The end-user can access targets by making specific searches and can use the statistical modules within the service in their field of interest. Therefore the IWS opens a new way in ionosphere and Space Weather research. The scientific applications covered by IWS concern, beyond Space Weather, also other fields like earthquake precursors, ionosphere climatology, geomagnetic storms, troposphere-ionosphere energy transfer, and trans-ionosphere link perturbations.
Ferencz, Csaba; Lizunov, Georgii; Crespon, François; Price, Ivan; Bankov, Ludmil; Przepiórka, Dorota; Brieß, Klaus; Dudkin, Denis; Girenko, Andrey; Korepanov, Valery; Kuzmych, Andrii; Skorokhod, Tetiana; Marinov, Pencho; Piankova, Olena; Rothkaehl, Hanna; Shtus, Tetyana; Steinbach, Péter; Lichtenberger, János; Sterenharz, Arnold; Vassileva, Any
Maar, Marion; Yeates, Karen; Barron, Marcia; Hua, Diane; Liu, Peter; Moy Lum-Kwong, Margaret; Perkins, Nancy; Sleeth, Jessica; Tobe, Joshua; Wabano, Mary Jo; Williamson, Pamela; Tobe, Sheldon W
Non-communicable chronic diseases are the leading causes of mortality globally, and nearly 80% of these deaths occur in low- and middle-income countries (LMICs). In high-income countries (HICs), inequitable distribution of resources affects poorer and otherwise disadvantaged groups, including Aboriginal peoples. Cardiovascular mortality in HICs has recently begun to fall; however, these improvements are not realized among citizens in LMICs or among those subgroups in HICs who are disadvantaged in the social determinants of health, including Aboriginal people. It is critical to develop multi-faceted, affordable and realistic health interventions in collaboration with groups who experience health inequalities. Based on community-based participatory research (CBPR), we aimed to develop implementation tools to guide complex interventions to ensure that health gains can be realized in low-resource environments. We developed the I-RREACH (Intervention and Research Readiness Engagement and Assessment of Community Health Care) tool to guide implementation of interventions in low-resource environments. We employed CBPR and a consensus methodology to (1) develop the theoretical basis of the tool and (2) identify key implementation factor domains; then, we (3) collected participant evaluation data to validate the tool during implementation. The I-RREACH tool was successfully developed using a community-based consensus method and is rooted in participatory principles, equalizing the importance of the knowledge and perspectives of researchers and community stakeholders while encouraging respectful dialogue. The I-RREACH tool consists of three phases: fact finding, stakeholder dialogue and community member/patient dialogue. The evaluation of our first implementation of I-RREACH by participants was overwhelmingly positive, with 95% or more of participants indicating comfort with and support for the process and the dialogue it creates. The I
Harkness, L.; Mazzoleni, L. R.; Dzepina, K.; Mazzoleni, C.; China, S.
Atmospheric science and climate change are becoming increasingly important, especially in education, as the Next Generation Science Standards now include climate change. A collaborating team of research scientists and students is studying the free troposphere, specifically the aerosol composition and properties, on the island of Pico in the Azores Archipelago. The research station sits in the caldera of Mount Pico, 2225 meters above sea level. At this elevation, the station is above the marine boundary layer, placing it in the free troposphere. In this work, a collaboration between a high school Earth Science teacher and university researchers was formed with the goal of developing classroom and outreach materials regarding atmospheric science. Among the materials, a video was created containing site and project background, explanation of some of the instruments used, and candid conversations regarding science and research. The video serves several purposes, such as informing students and the general public about what is happening in the atmosphere and informing students about the importance of science and research. The video could also be used to educate the local island community and tourists. Other materials designed include data obtained directly from the project, such as measurements of aerosol particles in electron microscopy photos (which were imaged for particle morphology and size) and the composition of the aerosol particles. Students can use this evidence, as well as other data, to gain a better understanding of aerosols and the overall effect they have on the climate. Students will discover this evidence as they work through a series of experiments and activities. Using the strategy of Claim-Evidence-Reasoning as a way to answer scientific questions, students will use the evidence they gathered to explain their ideas. One such question could be, 'How do aerosols affect the climate?' and the student's 'claim' is their answer to that question. In the
JMBE Production Editor
Correction for Sarah E. Council and Julie E. Horvath, “Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom,” which appeared in the Journal of Microbiology & Biology Education, volume 17, number 1, March 2016, pages 38–40.
The aim of this study is to investigate whether Lego could be used as a tool for reflective practice with social care practitioners (SCPs) and student practitioners. This article outlines an action research study conducted in an institute of higher education in Ireland. Findings from this study suggest that Lego can be used to support student…
Ng, Wan; Gunstone, Richard
Investigates the use of the World Wide Web (WWW) as a research and teaching tool in promoting self-directed learning groups of 15-year-old students. Discusses the perceptions of students of the effectiveness of the WWW in assisting them with the construction of knowledge on photosynthesis and respiration. (Contains 33 references.) (Author/YDS)
Powers, Christina M., E-mail: email@example.com [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Grieger, Khara D., E-mail: firstname.lastname@example.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Hendren, Christine Ogilvie, E-mail: email@example.com [Center for the Environmental Implications of NanoTechnology, Duke University, Durham, NC 27708 (United States); Meacham, Connie A., E-mail: firstname.lastname@example.org [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Gurevich, Gerald, E-mail: email@example.com [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Lassiter, Meredith Gooding, E-mail: firstname.lastname@example.org [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Money, Eric S., E-mail: email@example.com [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Lloyd, Jennifer M., E-mail: firstname.lastname@example.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Beaulieu, Stephen M., E-mail: email@example.com [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb.
Mishra, P; Patankar, A; Etmektzoglou, A; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States); Lewis, J [Brigham and Women’s Hospital, Boston, MA (United States)
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate the translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from a pre-existing DICOM-RT plan. The DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify it or add control points. The delivered beam can be verified by reading the generated images and analyzing trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are: i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public, cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee at Varian Medical Systems, Palo Alto.
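The expected-vs-actual comparison the abstract attributes to the trajectory log analyzer can be sketched as follows. This is an illustrative example, not Veritas code: real TrueBeam trajectory logs are binary, and here we assume the samples have already been decoded into (expected, actual) pairs per axis; the axis names and values are invented.

```python
def axis_deviations(samples):
    """samples: dict mapping axis name -> list of (expected, actual) pairs.

    Returns a dict mapping each axis name to a tuple of
    (max absolute deviation, mean absolute deviation)."""
    stats = {}
    for axis, pairs in samples.items():
        devs = [abs(expected - actual) for expected, actual in pairs]
        stats[axis] = (max(devs), sum(devs) / len(devs))
    return stats

# Hypothetical decoded log samples for two of the mechanical axes.
decoded = {
    "gantry": [(0.0, 0.02), (10.0, 10.05), (20.0, 19.98)],
    "mlc_leaf_01": [(1.5, 1.5), (2.0, 2.1)],
}
print(axis_deviations(decoded))
```

From such per-axis statistics, plots of expected vs. actual values and deviation histograms follow directly.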
Good, Marjorie J; Hurley, Patricia; Woo, Kaitlin M; Szczepanek, Connie; Stewart, Teresa; Robert, Nicholas; Lyss, Alan; Gönen, Mithat; Lilenbaum, Rogerio
Clinical research program managers are regularly faced with the quandary of determining how much of a workload research staff members can manage while they balance clinical practice and still achieve clinical trial accrual goals, maintain data quality and protocol compliance, and stay within budget. A tool was developed to measure clinical trial-associated workload, to apply objective metrics toward documentation of work, and to provide clearer insight to better meet clinical research program challenges and aid in balancing staff workloads. A project was conducted to assess the feasibility and utility of using this tool in diverse research settings. Community-based research programs were recruited to collect and enter clinical trial-associated monthly workload data into a web-based tool for 6 consecutive months. Descriptive statistics were computed for self-reported program characteristics and workload data, including staff acuity scores and number of patient encounters. Fifty-one research programs that represented 30 states participated. Median staff acuity scores were highest for staff with patients enrolled in studies and receiving treatment, relative to staff with patients in follow-up status. Treatment trials typically resulted in higher median staff acuity, relative to cancer control, observational/registry, and prevention trials. Industry trials exhibited higher median staff acuity scores than trials sponsored by the National Institutes of Health/National Cancer Institute, academic institutions, or others. The results from this project demonstrate that trial-specific acuity measurement is a better measure of workload than simply counting the number of patients. The tool was shown to be feasible and useable in diverse community-based research settings. Copyright © 2016 by American Society of Clinical Oncology.
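The descriptive statistics the abstract mentions (median staff acuity scores per trial category) can be sketched as below. This is an illustrative example only; the record schema, field names, and acuity values are invented, not the project's actual tool.

```python
from collections import defaultdict
from statistics import median

def median_acuity_by(records, key):
    """Group workload records by the given field and return the median
    staff acuity score for each group.

    records: list of dicts, each with an 'acuity' score and a grouping field."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec["acuity"])
    return {group: median(scores) for group, scores in groups.items()}

# Hypothetical monthly workload records.
records = [
    {"trial_type": "treatment", "acuity": 4.0},
    {"trial_type": "treatment", "acuity": 3.5},
    {"trial_type": "prevention", "acuity": 1.5},
    {"trial_type": "prevention", "acuity": 2.0},
    {"trial_type": "prevention", "acuity": 1.0},
]
print(median_acuity_by(records, "trial_type"))
```

Grouping by sponsor or patient status instead of trial type only requires changing the `key` argument.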
Curtis, Helen J; Goldacre, Ben
We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
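The two-stage matching described above (exact match on name to the current formulary, then a fall-back match to similar drug names) could be sketched as follows. The drug names, the similarity threshold, and the use of `difflib` are illustrative assumptions for demonstration, not the study's actual method.

```python
import difflib

def map_to_formulary(drug_names, formulary, cutoff=0.85):
    """Map historical drug names to current formulary names.

    Stage 1: case-insensitive exact match; stage 2: closest similar name
    above the cutoff; unmatched names map to None."""
    lookup = {name.lower(): name for name in formulary}
    mapping = {}
    for name in drug_names:
        exact = lookup.get(name.lower())
        if exact:
            mapping[name] = exact
            continue
        close = difflib.get_close_matches(name.lower(), list(lookup), n=1, cutoff=cutoff)
        mapping[name] = lookup[close[0]] if close else None
    return mapping

# Hypothetical formulary and historical names (one misspelled, one unknown).
formulary = ["Amoxicillin", "Paracetamol", "Atorvastatin"]
historical = ["amoxicillin", "Paracetomol", "Unknownium"]
print(map_to_formulary(historical, formulary))
```

Names left as `None` correspond to the small residue of drugs that could not be assigned a current chemical code.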
Background: To the best of our knowledge, a strategic approach to define the contents of structured clinical documentation tools for both routine clinical patient care and research purposes has not been reported so far, although electronic health records will become more and more structured and detailed in the future. Objective: To achieve an interdisciplinary consensus on a checklist to be considered in the preparation of disease- and situation-specific clinical documentation tools. Methods: A 2-round Delphi consensus-based process was conducted with 19 physicians of different disciplines and 14 students from Austria, Switzerland, and Germany. Agreement was defined as 80% or more positive votes of the participants. Results: The participants agreed that a working group should be set up for the development of structured disease- or situation-specific documentation tools (97% agreement). The final checklist included 4 recommendations concerning the setup of the working group, 12 content-related recommendations, and 3 general and technical recommendations (mean agreement [standard deviation] = 97.4% [4.0%], ranging from 84.2% to 100.0%). Discussion and Conclusion: In the future, disease- and situation-specific structured documentation tools will provide an important bridge between registries and electronic health records. Clinical documentation tools defined according to this Delphi consensus-based checklist will provide data for registries while serving as high-quality data acquisition tools in routine clinical care.
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-qu...
The term “organizational learning” raises a broad range of questions, specifically with regard to its contents. Following the thoughts of eminent philosophers such as Aristotle and Confucius, the contribution of scientists in any research field to the corpus of human knowledge should also be based on the proper governing of the use of language. It is therefore, first, of serious importance to be aware that organizational learning is just one dimension or element of the learning organization, and not vice versa; second, a good comprehension of the basic categories related to the organizational side of formal social units’ functioning is an imperative part of the organizational learning process. In writing this paper, the author started from his experiences acquired in his role as a lecturer on the subject “Theory of Organization”, in which the goal of lecturing was explained to students as gaining knowledge about the cooperation and competition of people in entities of rational production of goods. To generalize the presented questions and answers regarding the use of the term “organization” in the field of management, certain similarities and comparisons were sought and found in other fields of science and, more generally, in life itself. After more detailed explanations of other categories relevant to the organizational learning process, the process itself is defined by its goals and steps, where the overlapping of the learning process with the organizational change process and the process of increasing organizational capital is shown. Finally, it is also emphasized that the idea of improving internal relationships (the substance of organization) between employees in a formal social unit through organizational learning could and should be exploited in external relationships between formal social units.
Simmons, Aaron B; Bloomsburg, Samuel J; Billingslea, Samuel A; Merrill, Morgan M; Li, Shuai; Thomas, Marshall W; Fuerst, Peter G
superior colliculus. Pou4f2(Cre) provides multiple uses for the vision researcher's genetic toolkit. First, Pou4f2(Cre) is a knock-in allele that can be used to eliminate Pou4f2, resulting in depletion of RGCs. Second, expression of Cre in male germ cells makes this strain an efficient germline activator of recombination, for example, to target LoxP-flanked sequences in the whole mouse. Third, Pou4f2(Cre) efficiently targets RGCs, amacrine cells, bipolar cells, horizontal cells, and a small number of photoreceptors within the retina, as well as the visual centers in the brain. Unlike other Cre recombinase lines that target retinal neurons, no recombination was observed in Müller or other retinal glia. These properties make this Cre recombinase line a useful tool for vision researchers.
Adeline Phaik Harn Chua
Blogs appear to be gaining momentum as a marketing tool which organisations can use for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to address this knowledge gap, this paper presents a future research agenda (in the form of research questions) which can guide the eBusiness research community in conducting much-needed studies in this area. This paper is particularly novel in that it aims to demonstrate how the heterogeneity of SMEs and their specific business uses of eBusiness technology such as blogs can form the central plank of a future research agenda. This is important because the existing eBusiness literature tends to treat eBusiness collectively rather than focusing on the specific business uses of different eBusiness technologies, and to treat SMEs as a homogeneous group. The paper concludes with a discussion of how this research agenda can form the basis of studies which use a range of different research methods, and how this "big picture" agenda approach might help the eBusiness research community build theory which better explains SME adoption and use of eBusiness.
Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu
Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. The numerous bioinformatics tools that have been developed for miRNA analysis have utility for detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing languages. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: firstname.lastname@example.org.
Campbell, A. Malcolm; Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason
The "Vision and Change" report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area…
Charged particle activation analysis based on bombardment with 15 MeV protons from a cyclotron was used to study friction wear at the contact zones in cutting tools, roller bearings and gear teeth. The radioactivity of the resulting isotopes, such as Co-56, Co-58 and Re-183, serves as a measure of the mass changes on the tool surfaces. The method is suitable for studying the parameters affecting wear processes and the role of cutting fluid, and also for assessing economic factors in production planning
Pickering, W R
This book contains answers to all exercises featured in the accompanying textbook Science for Common Entrance: Physics , which covers every Level 1 and 2 topic in the ISEB 13+ Physics Common Entrance exam syllabus. - Clean, clear layout for easy marking. - Includes examples of high-scoring answers with diagrams and workings. - Suitable for ISEB 13+ Mathematics Common Entrance exams taken from Autumn 2017 onwards. Also available to purchase from the Galore Park website www.galorepark.co.uk :. - Science for Common Entrance: Physics. - Science for Common Entrance: Biology. - Science for Common En
This research is motivated by the incongruence between how businesses and contemporary research evaluate paid social media advertisements as online branding tools. We therefore examine the possibilities of social media marketing, specifically why sponsored posts on Facebook and Instagram are effective online branding tools. A questionnaire was used to conduct the research and test the hypotheses. Results from 316 participants indicated that sponsored posts were effective for brand awaren...
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Gaarder, Marie
Evidence-gap maps present a new addition to the tools available to support evidence-informed policy making. Evidence-gap maps are thematic evidence collections covering a range of issues such as maternal health, HIV/AIDS, and agriculture. They present a visual overview of existing systematic reviews or impact evaluations in a sector or subsector, schematically representing the types of int...
Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.
Recent years have seen increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these
Many of us nowadays invest significant amounts of time in sharing our activities and opinions with friends and family via social networking tools such as Facebook, Twitter or other related websites. However, despite the availability of many platforms for scientists to connect and...
Kuru Cetin, Saadet
In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools using alternative observation tools regarding the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years and the class observations…
Podhora, A.; Helming, K.; Adenauer, L.; Heckelei, T.; Kautto, P.; Reidsma, P.; Rennings, K.; Turnpenny, J.; Jansen, J.M.L.
Since 2002, the European Commission has employed the instrument of ex-ante impact assessments (IA) to help focus its policy-making process on implementing sustainable development. Scientific tools should play an essential role in providing the evidence base to assess the impacts of alternative
Aluja, Jaime Gil
Little by little we are being provided with an arsenal of operative instruments of a non-numerical nature, in the shape of models and algorithms, capable of providing answers to the “aggressions” which our economics and management systems must withstand, coming from an environment full of turmoil. In the work we are presenting, we dare to propose a set of elements from which we hope will arise approaches capable of renewing those structures of economic thought which are upheld by the geometrical idea. The concepts of pretopology and topology, habitually marginalized in economics and management studies, have centred our interest in recent times. We consider that it is not possible today to conceive formal structures capable of representing the Darwinian concept of economic behaviour without resorting to this fundamental generalisation of metric spaces. In our attempts to find a solid base for the structures proposed for the treatment of economic phenomena, we have frequently resorted to the theory ...
Pazos, Florencio; Chagoyen, Monica
Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised because of their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of these servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects of every moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: email@example.com.
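A server-availability ("alive") check of the kind such a survey relies on can be sketched as follows. The classification rule here (any HTTP response with a status below 400 counts as alive; connection errors and timeouts count as dead) is an assumption for illustration, not the authors' actual criterion.

```python
from urllib import request

def classify_status(status_code):
    """Map an HTTP status code to an alive/dead label (assumed rule)."""
    return "alive" if status_code < 400 else "dead"

def check_server(url, timeout=10):
    """Fetch the URL and classify the server; any network failure
    (DNS error, refused connection, timeout) is treated as dead."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return classify_status(resp.status)
    except OSError:
        return "dead"
```

Running `check_server` over the full list of published server URLs, and repeating the sweep over time, would yield the kind of decay-rate data the study reports.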
Verhagen, Evert; Voogt, Nelly; Bruinsma, Anja; Finch, Caroline F
Evidence of effectiveness does not equal successful implementation. To progress the field, practical tools are needed to bridge the gap between research and practice and to truly unite effectiveness and implementation evidence. This paper describes the Knowledge Transfer Scheme, which integrates existing implementation research frameworks into a tool developed specifically to bridge the gap between knowledge derived from research on the one hand and evidence-based, usable information and tools for practice on the other.
Rhebergen, Martijn; Van Dijk, Frank; Hulshof, Carel
Many workers have questions about occupational safety and health (OSH). Answers to these questions empower them to further improve their knowledge about OSH, make good decisions about OSH matters and improve OSH practice when necessary. Nevertheless, many workers fail to find the answers to their questions. This paper explores the challenges workers may face when seeking answers to their OSH questions. Findings suggest that many workers may lack the skills, experience or motivation to formulate an answerable question, seek and find information, appraise information, compose correct answers and apply information in OSH practice. Simultaneously, OSH knowledge infrastructures often insufficiently support workers in answering their OSH questions. This paper discusses several potentially attractive strategies for developing and improving OSH knowledge infrastructures: 1) providing courses that teach workers to ask answerable questions and to train them to find, appraise and apply information, 2) developing information and communication technology tools or facilities that support workers as they complete one or more stages in the process from question to answer and 3) tailoring information and implementation strategies to the workers' needs and context to ensure that the information can be applied to OSH practice more easily.
Wallis, Selina; Cole, Donald C; Gaye, Oumar; Mmbaga, Blandina T; Mwapasa, Victor; Tagbor, Harry; Bates, Imelda
Research is key to achieving global development goals. Our objectives were to develop and test an evidence-informed process for assessing health research management and support systems (RMSS) in four African universities and for tracking interventions to address capacity gaps. Setting: four African universities. Participants: 83 university staff and students from 11 cadres. A literature-informed 'benchmark' was developed and used to itemise all components of a university's health RMSS. Data on all components were collected during site visits to four African universities using interview guides, document reviews and facilities observation guides. Gaps in RMSS capacity were identified against the benchmark and institutional action plans developed to remedy gaps. Progress against indicators was tracked over 15 months and common challenges and successes identified. Common gaps in operational health research capacity included no accessible research strategy, a lack of research e-tracking capability and inadequate quality checks for proposal submissions and contracts. Feedback indicated that the capacity assessment was comprehensive and generated practical actions, several of which were no-cost. Regular follow-up helped to maintain focus on activities to strengthen health research capacity in the face of challenges. Identification of each institution's strengths and weaknesses against an evidence-informed benchmark enabled them to identify gaps in their operational health research systems, to develop prioritised action plans, to justify resource requests to fulfil the plans and to track progress in strengthening RMSS. Use of a standard benchmark, approach and tools enabled comparisons across institutions, which has accelerated production of evidence about the science of research capacity strengthening. The tools could be used by institutions seeking to understand their strengths and to address gaps in research capacity. Research capacity gaps that were common to several institutions could be
Poster presented at the Research Bazaar 2015 at Melbourne University, Australia. Conference attendees were asked to share an overview of their project and the digital platforms they used in their research.
Bosma, W.E.; Theune, Mariet; van Hooijdonk, C.M.J.; Krahmer, E.; Maes, F.
In this paper we discuss and evaluate a method for automatic text illustration, applied to answers to medical questions. Our method for selecting illustrations is based on the idea that similarities between the answers and picture-related text (the picture’s caption or the section/paragraph that
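The similarity-based selection the abstract describes can be sketched as follows. This is an illustrative bag-of-words cosine-similarity scorer, not the authors' actual method; the function names and tokenization are hypothetical:

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two texts (naive whitespace tokens)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def select_illustration(answer: str, captions: dict) -> str:
    """Pick the picture whose caption text is most similar to the answer text."""
    return max(captions, key=lambda pic: cosine_similarity(answer, captions[pic]))
```

In practice such a scorer would use proper tokenization, stemming and tf-idf weighting, and could compare against the picture's surrounding section text as well as its caption.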
Dahl, Jan Erik
In the studied master's course, students participated both as research objects in a digital annotation experiment and as critical investigators of this technology in their semester projects. The students' role paralleled the researcher's role, opening an opportunity for researcher-student co-learning within what is often referred to as…
Sturzenegger, Susi; Johnsson, Kai; Riezman, Howard
Funded by the Swiss National Science Foundation to promote cutting edge research as well as the advancement of young researchers and women, technology transfer, outreach and education, the NCCR (Swiss National Centre of Competence in Research) Chemical Biology is co-led by Howard Riezman, University of Geneva and Kai Johnsson, École Polytechnique Fédérale de Lausanne (EPFL).
Highlights: • A GUI-based intuitive tool for data format analysis is presented. • Data can be viewed in any data types specified by the user in real time. • Analyzed formats are saved and reused as templates for other data of the same forms. • Users can easily extract contents in any forms by writing a simple script file. • The tool would be useful for exchanging data in collaborative fusion research. - Abstract: An intuitive tool with graphical user interface (GUI) for analyzing formats and extracting contents of binary data in fusion research is presented. Users can examine structures of binary data at arbitrary addresses by selecting their type from a list of radio buttons in the data inspection window and checking their representations instantly on the computer screen. The result of analysis is saved in a file which contains information such as the name, data type, start address, and array size of the data. If the array size of some data depends on values that appear earlier, and the users specify this relation in the inspection window, the resultant file can also be used as a format template for the same series of data. By writing a simple script, the users can extract the contents of data either to a text or binary file in the format of their preference. As a real-life example, the tool is applied to the MHD equilibrium data at JT-60U, where poloidal flux data are extracted and converted to a format suitable for contour plotting in another data visualization program. The tool would be useful in collaborative fusion research for exchanging relatively small-size data, which don’t fit in well with the standard routine processes
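The template-driven extraction step described above — where one field's value determines the array size of a later field — can be sketched like this. The template representation and field names are hypothetical; the actual tool's script syntax is not specified in the abstract:

```python
import struct

# Hypothetical format template: (name, struct format code, count);
# a None count means "use the most recently read integer as the array size".
TEMPLATE = [
    ("n_points", "i", 1),   # array size, read first...
    ("flux", "d", None),    # ...determines the length of this array
]

def extract(data: bytes, template):
    """Read fields sequentially from little-endian binary data per the template."""
    out, offset, last_size = {}, 0, 1
    for name, code, count in template:
        n = last_size if count is None else count
        values = struct.unpack_from(f"<{n}{code}", data, offset)
        out[name] = values[0] if n == 1 else list(values)
        offset += struct.calcsize(code) * n
        if isinstance(out[name], int):
            last_size = out[name]  # remember size fields for later arrays
    return out
```

A script in the real tool would additionally handle nested structures and write the extracted contents back out as text or binary in the user's preferred format.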
... Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) RULES OF PRACTICE IN ENFORCEMENT PROCEEDINGS UNDER SECTION 41 OF THE LONGSHOREMEN'S AND HARBOR WORKERS' COMPENSATION ACT Prehearing Procedures § 1921.4 Answer. (a) Filing and service. Within 14 days after the...
Enables efficient assessment of pupils' performance at Levels 1 and 2 of the ISEB 13+ Common Entrance syllabus. Clear layout saves time marking work and identifies areas requiring further attention. Includes diagrams and working where necessary, to demonstrate how to present high-scoring answers in Level 1 and 2 exams.
Dijkstra, W.; Ongena, Y.P.
Interaction analysis was used to analyze a total of 14,265 question-answer sequences (Q-A sequences) arising from 80 questions that originated from two face-to-face and three telephone surveys. The analysis was directed towards the causes and effects of particular interactional problems. Our results showed
National Inst. of Arthritis and Musculoskeletal and Skin Diseases (NIH), Bethesda, MD.
This fact sheet answers general questions about Marfan syndrome, a heritable condition that affects the connective tissue. It describes the characteristics of the disorder, the diagnostic process, and ways to manage symptoms. Characteristics include: (1) people with Marfan syndrome are typically very tall, slender, and loose jointed; (2) more than…
Meijer, R.R.; Sotaridona, Leonardo
Two new indices to detect answer copying on a multiple-choice test, S1 and S2, are proposed. The S1 index is similar to the K-index (P. Holland, 1996) and the K̄2 (K2) index (L. Sotaridona and R. Meijer, in press), but the distribution of the number of matching
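The common idea behind such copying indices — compare the observed number of matching answers between two examinees with what chance alone would produce — can be sketched as follows. This is an illustrative binomial model with a hypothetical constant match probability, not the actual S1/S2 statistics:

```python
from math import comb

def num_matches(answers_a: str, answers_b: str) -> int:
    """Count identical responses at the same item positions for two examinees."""
    return sum(a == b for a, b in zip(answers_a, answers_b))

def binomial_tail(m: int, n: int, p: float) -> float:
    """P(X >= m) for X ~ Binomial(n, p): the chance of m or more matches
    arising by luck alone, given match probability p per item."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))
```

A small tail probability for the observed match count flags a suspicious pair; the indices in the paper refine this by modeling the match probability per item from the examinees' ability and the item responses, rather than assuming it constant.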
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…
In some math classrooms, students are taught to follow and memorize procedures to arrive at the correct solution to problems. In this article, author Mike Flynn suggests a way to move beyond answer-getting to true problem solving. He describes an instructional approach called three-act tasks in which students solve an engaging math problem in…
Employment Policies Inst. Foundation, Washington, DC.
This booklet, which is designed to clarify facts regarding the minimum wage's impact on marketplace economics, contains a total of 31 questions and answers pertaining to the following topics: relationship between minimum wages and poverty; impacts of changes in the minimum wage on welfare reform; and possible effects of changes in the minimum wage…
In connection with the intention of DWK to erect a fuel reprocessing plant in the Oberpfalz, citizens have asked a great number of questions which are of interest to the general public. They have been collected, grouped into subject categories and answered by experts. (orig./HSCH)
... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Answer. 17.9 Section 17.9 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL CIVIL MONEY PENALTIES HEARINGS... Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, rm. 1061, Rockville, MD...
Kademani, B. S.; Vijai Kumar
This paper highlights the information explosion, the need for bibliographic control, and the need for information retrieval tools. It explains the emergence of the citation index, the concept of citation indexing, reasons for citing, its structure (print and electronic versions of the Science Citation Index and Social Science Citation Index), and the application of citation indexes. It also discusses search effectiveness, the factors taken into consideration for coverage of journals in citation indexes, Journal Cita...
Trevino, Victor; Falciani, Francesco; Barrera-Saldaña, Hugo A
Among the many benefits of the Human Genome Project are new and powerful tools such as the genome-wide hybridization devices referred to as microarrays. Initially designed to measure gene transcriptional levels, microarray technologies are now used for comparing other genome features among individuals and their tissues and cells. Results provide valuable information on disease subcategories, disease prognosis, and treatment outcome. Likewise, they reveal differences in genetic makeup, regulat...
Since the end of the 19th century the Calabria region in southern Italy has been known for an abundance of grooved stone axes and hammers used during late prehistory. These artefacts are characterized by a wide and often pronounced groove in the middle of the implement, thought to have aided securing the head to a wooden haft. Their widespread presence is known both in prehistoric archaeological literature and in the archaeological collections of various regional and extra-regional museums. At first, scholars did not relate these tools to the rich Calabrian ore deposits and to possible ancient mining activities; they were regarded simply as a variant of ground lithic industry of Neolithic tradition. However, between 1997 and 2012, about 50 tools were discovered in the prehistoric mine of Grotta della Monaca in northern Calabria where there are outcrops of copper and iron ore. This allowed us to recognize their specific mining value and to consider them as a sort of “guide fossil” for the identification of ancient mining districts. This paper presents the results of a study involving over 150 tools from the entire region, effectively demonstrating an almost perfect co-occurrence of grooved axes and hammers with areas rich in mineral resources, especially metalliferous ores.